PESTO
Code of the Performance Estimation Toolbox (PESTO), whose aim is to ease access to the PEP (performance estimation problem) methodology. The numerical worst-case analyses from PEP can be performed simply by writing the algorithms just as you would implement them.

Exact worst-case performance of first-order methods for composite convex optimization: We provide a framework for computing the exact worst-case performance of any algorithm belonging to a broad class of oracle-based first-order methods for composite convex optimization, including those performing explicit, projected, proximal, conditional and inexact (sub)gradient steps. We simultaneously obtain tight worst-case guarantees and explicit instances of optimization problems on which the algorithm reaches this worst-case. We achieve this by reducing the computation of the worst-case to solving a convex semidefinite program, generalizing previous works on performance estimation by Drori and Teboulle [13] and the authors [43]. We use these developments to obtain a tighter analysis of the proximal point algorithm and of several variants of fast proximal gradient, conditional gradient, subgradient and alternating projection methods. In particular, we present a new analytical worst-case guarantee for the proximal point algorithm that improves on the previously known one by a factor of two, and we improve the standard worst-case guarantee for the conditional gradient method by more than a factor of two. We also show how the optimized gradient method proposed by Kim and Fessler in [22] can be extended by incorporating a projection or a proximal operator, which leads to an algorithm that converges in the worst-case twice as fast as the standard accelerated proximal gradient method [2].
References in zbMATH (referenced in 36 articles)
Showing results 1 to 20 of 36.
Sorted by year:
- Drori, Yoel; Taylor, Adrien: On the oracle complexity of smooth strongly convex minimization (2022)
- Salzo, Saverio; Villa, Silvia: Parallel random block-coordinate forward-backward algorithm: a unified convergence analysis (2022)
- Bachir, M.; Fabre, A.; Tapia-García, S.: Finitely determined functions (2021)
- Hildebrand, Roland: Optimal step length for the Newton method: case of self-concordant functions (2021)
- Hu, Bin; Seiler, Peter; Lessard, Laurent: Analysis of biased stochastic gradient descent using sequential semidefinite programs (2021)
- Kim, Donghwan; Fessler, Jeffrey A.: Optimizing the efficiency of first-order methods for decreasing the gradient of smooth convex functions (2021)
- Lieder, Felix: On the convergence rate of the Halpern-iteration (2021)
- Li, Zhongguo; Dong, Zhen; Liang, Zhongchao; Ding, Zhengtao: Surrogate-based distributed optimisation for expensive black-box functions (2021)
- Madden, Liam; Becker, Stephen; Dall’Anese, Emiliano: Bounds for the tracking error of first-order online optimization methods (2021)
- Tan, Sandra S. Y.; Varvitsiotis, Antonios; Tan, Vincent Y. F.: Analysis of optimization algorithms via sum-of-squares (2021)
- Wilson, Ashia C.; Recht, Ben; Jordan, Michael I.: A Lyapunov analysis of accelerated methods in optimization (2021)
- Zhang, Guodong; Bao, Xuchan; Lessard, Laurent; Grosse, Roger: A unified analysis of first-order methods for smooth games via integral quadratic constraints (2021)
- Alkousa, M. S.; Gasnikov, A. V.; Dvinskikh, D. M.; Kovalev, D. A.; Stonyakin, F. S.: Accelerated methods for saddle-point problem (2020)
- Asl, Azam; Overton, Michael L.: Analysis of the gradient method with an Armijo-Wolfe line search on a class of non-smooth convex functions (2020)
- Banert, Sebastian; Ringh, Axel; Adler, Jonas; Karlsson, Johan; Öktem, Ozan: Data-driven nonsmooth optimization (2020)
- De Klerk, Etienne; Glineur, François; Taylor, Adrien B.: Worst-case convergence analysis of inexact gradient and Newton methods through semidefinite programming performance estimation (2020)
- Drori, Yoel: On the properties of convex functions over open sets (2020)
- Drori, Yoel; Taylor, Adrien B.: Efficient first-order methods for convex minimization: a constructive approach (2020)
- Gu, Guoyong; Yang, Junfeng: Tight sublinear convergence rate of the proximal point algorithm for maximal monotone inclusion problems (2020)
- Rieger, Janosch; Tam, Matthew K.: Backward-forward-reflected-backward splitting for three operator monotone inclusions (2020)