NESUN

NESUN - Nesterov’s universal gradient method: Universal gradient methods for convex optimization problems. In this paper, we present new methods for black-box convex minimization. They do not need to know in advance the actual level of smoothness of the objective function; their only essential input parameter is the required accuracy of the solution. At the same time, for each particular problem class they automatically ensure the best possible rate of convergence. We confirm our theoretical results by encouraging numerical experiments, which demonstrate that the fast rate of convergence typical of smooth optimization problems can sometimes be achieved even on nonsmooth problem instances.
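The abstract does not spell out the mechanism, but the core idea behind the universal gradient methods is a backtracking search on the smoothness estimate with an extra slack of eps/2 in the acceptance test, which lets the search terminate even when the gradient is only Hölder continuous. Below is a minimal Python sketch of the universal primal gradient variant in the unconstrained Euclidean setting; the names (universal_primal_gradient, f, grad, x0, L0) and the gradient-norm stopping heuristic are illustrative assumptions, not the paper's exact algorithm.

    import numpy as np

    def universal_primal_gradient(f, grad, x0, eps, L0=1.0, max_iter=1000):
        # Sketch of a universal primal gradient step: the smoothness level
        # is never supplied; an estimate L is adapted by doubling/halving,
        # and the target accuracy eps is the only essential parameter.
        x = np.asarray(x0, dtype=float)
        L = L0
        for _ in range(max_iter):
            g = grad(x)
            fx = f(x)
            # Backtracking: double L until the inexact quadratic upper
            # bound with slack eps/2 accepts the candidate step.
            while True:
                x_new = x - g / L
                d = x_new - x
                if f(x_new) <= fx + g @ d + 0.5 * L * (d @ d) + 0.5 * eps:
                    break
                L *= 2.0
            x = x_new
            L *= 0.5  # optimistic halving for the next iteration
            if np.linalg.norm(g) <= eps:  # crude stopping heuristic (assumption)
                break
        return x

    # Example on a nonsmooth instance, f(x) = ||x||_1, with a subgradient oracle.
    x = universal_primal_gradient(
        f=lambda v: np.abs(v).sum(),
        grad=np.sign,
        x0=np.array([3.0, -2.0]),
        eps=1e-3,
    )

The eps/2 slack is the essential design choice: a merely Hölder-smooth or nonsmooth f has no finite Lipschitz constant for the gradient, yet the inexact upper bound is satisfied once L is large enough, so the doubling loop always terminates and the method adapts to whatever smoothness the problem actually has.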


References in zbMATH (referenced in 32 articles)

Showing results 1 to 20 of 32, sorted by year (citations).


  1. Nesterov, Yurii: Implementable tensor methods in unconstrained convex optimization (2021)
  2. Berger, Guillaume O.; Absil, P.-A.; Jungers, Raphaël M.; Nesterov, Yurii: On the quality of first-order approximation of functions with Hölder continuous gradient (2020)
  3. Lei, Lihua; Jordan, Michael I.: On the adaptivity of stochastic gradient-based optimization (2020)
  4. Rodomanov, Anton; Nesterov, Yurii: Smoothness parameter of power of Euclidean norm (2020)
  5. Roulet, Vincent; d’Aspremont, Alexandre: Sharpness, restart, and acceleration (2020)
  6. Scieur, Damien; d’Aspremont, Alexandre; Bach, Francis: Regularized nonlinear acceleration (2020)
  7. Silveti-Falls, Antonio; Molinari, Cesare; Fadili, Jalal: Generalized conditional gradient with augmented Lagrangian for composite minimization (2020)
  8. Ahookhosh, Masoud: Accelerated first-order methods for large-scale convex optimization: nearly optimal complexity under strong convexity (2019)
  9. Ahookhosh, Masoud; Neumaier, Arnold: An optimal subgradient algorithm with subspace search for costly convex optimization problems (2019)
  10. Baimurzina, D. R.; Gasnikov, A. V.; Gasnikova, E. V.; Dvurechensky, P. E.; Ershov, E. I.; Kubentaeva, M. B.; Lagunovskaya, A. A.: Universal method of searching for equilibria and stochastic equilibria in transportation networks (2019)
  11. Cartis, Coralia; Gould, Nick I.; Toint, Philippe L.: Universal regularization methods: varying the power, the smoothness and the accuracy (2019)
  12. Davis, Damek; Drusvyatskiy, Dmitriy: Stochastic model-based minimization of weakly convex functions (2019)
  13. Diakonikolas, Jelena; Orecchia, Lorenzo: The approximate duality gap technique: a unified theory of first-order methods (2019)
  14. Drusvyatskiy, D.; Paquette, C.: Efficiency of minimizing compositions of convex functions and smooth maps (2019)
  15. Gasnikov, A. V.; Dvurechensky, P. E.; Stonyakin, F. S.; Titov, A. A.: An adaptive proximal method for variational inequalities (2019)
  16. Gasnikov, A. V.; Tyurin, A. I.: Fast gradient descent for convex minimization problems with an oracle producing a (δ, L)-model of function at the requested point (2019)
  17. Guminov, Sergey; Gasnikov, Alexander; Anikin, Anton; Gornov, Alexander: A universal modification of the linear coupling method (2019)
  18. Ho, Chin Pang; Parpas, Panos: Empirical risk minimization: probabilistic complexity and stepsize strategy (2019)
  19. Krutikov, V. N.; Samoilenko, N. S.; Meshechkin, V. V.: On the properties of the method of minimization for convex functions with relaxation on the distance to extremum (2019)
  20. Renegar, James: Accelerated first-order methods for hyperbolic programming (2019)
