SuperMann: a superlinearly convergent algorithm for finding fixed points of nonexpansive operators.

Operator splitting techniques have recently gained popularity in convex optimization problems arising in various control fields. Being fixed-point iterations of nonexpansive operators, such methods suffer from many well-known downsides, including high sensitivity to ill conditioning and parameter selection, and consequently low accuracy and robustness. As a universal solution we propose SuperMann, a Newton-type algorithm for finding fixed points of nonexpansive operators. It generalizes the classical Krasnosel’skii-Mann scheme, enjoys its favorable global convergence properties, and requires exactly the same oracle. It is based on a novel separating hyperplane projection tailored to nonexpansive mappings which makes it possible to include steps along any direction. In particular, when the directions satisfy a Dennis-Moré condition we show that SuperMann converges superlinearly under mild assumptions which, surprisingly, do not entail nonsingularity of the Jacobian at the solution but merely metric subregularity. As a result, SuperMann enhances and robustifies all operator splitting schemes for structured convex optimization, overcoming their well-known sensitivity to ill conditioning.
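For readers unfamiliar with the baseline, the following is a minimal sketch of the classical Krasnosel’skii-Mann iteration that SuperMann generalizes: given a nonexpansive operator T, it averages the current iterate with T applied to it, x_{k+1} = (1-λ) x_k + λ T(x_k). The operator, step size, and stopping rule below are illustrative choices, not taken from the paper.

```python
import numpy as np

def krasnoselskii_mann(T, x0, lam=0.5, tol=1e-8, max_iter=1000):
    """Classical KM iteration x_{k+1} = (1 - lam) x_k + lam T(x_k).

    T is assumed nonexpansive (1-Lipschitz) with a nonempty fixed-point set;
    lam in (0, 1) is the relaxation parameter.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        Tx = T(x)
        if np.linalg.norm(x - Tx) <= tol:  # fixed-point residual ||x - Tx||
            break
        x = (1 - lam) * x + lam * Tx
    return x

# Example: a 90-degree rotation is nonexpansive (an isometry) with unique
# fixed point at the origin. Plain iteration x_{k+1} = T(x_k) would cycle
# forever, but the averaged KM update converges.
R = np.array([[0.0, -1.0], [1.0, 0.0]])
x_star = krasnoselskii_mann(lambda x: R @ x, x0=[1.0, 1.0])
```

The rotation example illustrates why averaging matters: KM converges (weakly, in general Hilbert spaces) for merely nonexpansive operators, whereas the unaveraged Picard iteration need not. SuperMann keeps this global safeguard while admitting fast Newton-type directions.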

References in zbMATH (referenced in 12 articles)


  1. Akian, Marianne; Gaubert, Stéphane; Qu, Zheng; Saadi, Omar: Multiply accelerated value iteration for nonsymmetric affine fixed point problems and application to Markov decision processes (2022)
  2. De Marchi, Alberto: On a primal-dual Newton proximal method for convex quadratic programs (2022)
  3. Themelis, Andreas; Stella, Lorenzo; Patrinos, Panagiotis: Douglas-Rachford splitting and ADMM for nonconvex optimization: accelerated and Newton-type linesearch algorithms (2022)
  4. Ahookhosh, Masoud; Themelis, Andreas; Patrinos, Panagiotis: A Bregman forward-backward linesearch algorithm for nonconvex composite optimization: superlinear convergence to nonisolated local minima (2021)
  5. O’Donoghue, Brendan: Operator splitting for a homogeneous embedding of the linear complementarity problem (2021)
  6. Fu, Anqi; Zhang, Junzi; Boyd, Stephen: Anderson accelerated Douglas-Rachford splitting (2020)
  7. Li, Xiuxian; Xie, Lihua: Distributed algorithms for computing a fixed point of multi-agent nonexpansive operators (2020)
  8. Malitsky, Yura: Golden ratio algorithms for variational inequalities (2020)
  9. Zhang, Junzi; O’Donoghue, Brendan; Boyd, Stephen: Globally convergent type-I Anderson acceleration for nonsmooth fixed-point iterations (2020)
  10. Zheng, Yang; Fantuzzi, Giovanni; Papachristodoulou, Antonis; Goulart, Paul; Wynn, Andrew: Chordal decomposition in operator-splitting methods for sparse semidefinite programs (2020)
  11. Busseti, Enzo; Moursi, Walaa M.; Boyd, Stephen: Solution refinement at regular points of conic problems (2019)
  12. Themelis, Andreas; Patrinos, Panagiotis: SuperMann: a superlinearly convergent algorithm for finding fixed points of nonexpansive operators (2019)