PESTO

Code of the Performance Estimation Toolbox (PESTO), whose aim is to ease access to the performance estimation problem (PEP) methodology. The numerical worst-case analyses from PEP can be performed simply by writing the algorithms just as you would implement them.

Exact worst-case performance of first-order methods for composite convex optimization: We provide a framework for computing the exact worst-case performance of any algorithm belonging to a broad class of oracle-based first-order methods for composite convex optimization, including those performing explicit, projected, proximal, conditional and inexact (sub)gradient steps. We simultaneously obtain tight worst-case guarantees and explicit instances of optimization problems on which the algorithm reaches this worst case. We achieve this by reducing the computation of the worst case to solving a convex semidefinite program, generalizing previous works on performance estimation by Drori and Teboulle [13] and the authors [43]. We use these developments to obtain a tighter analysis of the proximal point algorithm and of several variants of fast proximal gradient, conditional gradient, subgradient and alternating projection methods. In particular, we present a new analytical worst-case guarantee for the proximal point algorithm that is twice as strong as the previously best known one, and we improve the standard worst-case guarantee for the conditional gradient method by more than a factor of two. We also show how the optimized gradient method proposed by Kim and Fessler in [22] can be extended by incorporating a projection or a proximal operator, which leads to an algorithm that converges in the worst case twice as fast as the standard accelerated proximal gradient method [2].
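The semidefinite-programming reduction mentioned above can be illustrated on a toy case. The sketch below is written in Python with CVXPY rather than through PESTO's own MATLAB interface, and the chosen constants (L = R = 1) and helper names are illustrative assumptions, not part of the toolbox: it computes the exact worst case of a single gradient step on an L-smooth convex function, using the interpolation conditions that characterize that function class.

    # A minimal sketch of the PEP reduction, assuming Python with NumPy and
    # CVXPY installed (PESTO itself is a MATLAB toolbox; this is not its API).
    # Worst case of one gradient step x1 = x0 - (1/L)*g0 over all L-smooth
    # convex f with ||x0 - x*|| <= R, posed as a semidefinite program.
    import numpy as np
    import cvxpy as cp

    L, R = 1.0, 1.0  # smoothness constant and initial distance (assumed values)

    # Gram matrix of the vectors (x0 - x*, g0, g1); the gradient at x* is 0.
    G = cp.Variable((3, 3), PSD=True)
    f = cp.Variable(2)  # f[0] = f(x0) - f*, f[1] = f(x1) - f*

    # Coordinates of iterates and gradients in the Gram basis.
    x = {"*": np.zeros(3),
         "0": np.array([1.0, 0.0, 0.0]),
         "1": np.array([1.0, -1.0 / L, 0.0])}  # x1 - x* = (x0 - x*) - g0/L
    g = {"*": np.zeros(3),
         "0": np.array([0.0, 1.0, 0.0]),
         "1": np.array([0.0, 0.0, 1.0])}
    fv = {"*": 0, "0": f[0], "1": f[1]}

    def ip(u, v):
        # Inner product <u, v>, read off the Gram matrix.
        return u @ G @ v

    # Interpolation conditions characterizing L-smooth convex functions:
    # f_i >= f_j + <g_j, x_i - x_j> + ||g_i - g_j||^2 / (2L) for all i != j.
    cons = [fv[i] >= fv[j] + ip(g[j], x[i] - x[j])
            + ip(g[i] - g[j], g[i] - g[j]) / (2 * L)
            for i in x for j in x if i != j]
    cons.append(ip(x["0"], x["0"]) <= R**2)  # ||x0 - x*||^2 <= R^2

    # Maximizing f(x1) - f* over this SDP gives the exact worst case.
    prob = cp.Problem(cp.Maximize(fv["1"]), cons)
    prob.solve()
    print(prob.value)  # ~0.1667 = L*R^2/6, the known tight one-step bound

The optimal value reproduces the tight one-step bound L*R^2/6 for gradient descent with step 1/L, and the optimal Gram matrix encodes an explicit worst-case function achieving it; PESTO automates this construction for general first-order methods described just as one would implement them.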


References in zbMATH (referenced in 33 articles)

Showing results 1 to 20 of 33, sorted by year (citations).


  1. Bachir, M.; Fabre, A.; Tapia-García, S.: Finitely determined functions (2021)
  2. Hu, Bin; Seiler, Peter; Lessard, Laurent: Analysis of biased stochastic gradient descent using sequential semidefinite programs (2021)
  3. Kim, Donghwan; Fessler, Jeffrey A.: Optimizing the efficiency of first-order methods for decreasing the gradient of smooth convex functions (2021)
  4. Lieder, Felix: On the convergence rate of the Halpern-iteration (2021)
  5. Li, Zhongguo; Dong, Zhen; Liang, Zhongchao; Ding, Zhengtao: Surrogate-based distributed optimisation for expensive black-box functions (2021)
  6. Madden, Liam; Becker, Stephen; Dall’Anese, Emiliano: Bounds for the tracking error of first-order online optimization methods (2021)
  7. Tan, Sandra S. Y.; Varvitsiotis, Antonios; Tan, Vincent Y. F.: Analysis of optimization algorithms via sum-of-squares (2021)
  8. Wilson, Ashia C.; Recht, Ben; Jordan, Michael I.: A Lyapunov analysis of accelerated methods in optimization (2021)
  9. Zhang, Guodong; Bao, Xuchan; Lessard, Laurent; Grosse, Roger: A unified analysis of first-order methods for smooth games via integral quadratic constraints (2021)
  10. Alkousa, M. S.; Gasnikov, A. V.; Dvinskikh, D. M.; Kovalev, D. A.; Stonyakin, F. S.: Accelerated methods for saddle-point problem (2020)
  11. Asl, Azam; Overton, Michael L.: Analysis of the gradient method with an Armijo-Wolfe line search on a class of non-smooth convex functions (2020)
  12. Banert, Sebastian; Ringh, Axel; Adler, Jonas; Karlsson, Johan; Öktem, Ozan: Data-driven nonsmooth optimization (2020)
  13. De Klerk, Etienne; Glineur, François; Taylor, Adrien B.: Worst-case convergence analysis of inexact gradient and Newton methods through semidefinite programming performance estimation (2020)
  14. Drori, Yoel: On the properties of convex functions over open sets (2020)
  15. Drori, Yoel; Taylor, Adrien B.: Efficient first-order methods for convex minimization: a constructive approach (2020)
  16. Gu, Guoyong; Yang, Junfeng: Tight sublinear convergence rate of the proximal point algorithm for maximal monotone inclusion problems (2020)
  17. Rieger, Janosch; Tam, Matthew K.: Backward-forward-reflected-backward splitting for three operator monotone inclusions (2020)
  18. Ryu, Ernest K.; Taylor, Adrien B.; Bergeling, Carolina; Giselsson, Pontus: Operator splitting performance estimation: tight contraction factors and optimal parameter selection (2020)
  19. Ryu, Ernest K.; Vũ, Bằng Công: Finding the forward-Douglas-Rachford-forward method (2020)
  20. Zhang, Hui: New analysis of linear convergence of gradient-type methods via unifying error bound conditions (2020)