NESUN - Nesterov’s universal gradient method: Universal gradient methods for convex optimization problems. In this paper, we present new methods for black-box convex minimization. They do not need to know in advance the actual level of smoothness of the objective function. Their only essential input parameter is the required accuracy of the solution. At the same time, for each particular problem class they automatically ensure the best possible rate of convergence. We confirm our theoretical results by encouraging numerical experiments, which demonstrate that the fast rate of convergence, typical of smooth optimization problems, can sometimes be achieved even on nonsmooth problem instances.
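The adaptivity described in the abstract comes from backtracking on the smoothness estimate rather than requiring a Lipschitz or Hölder constant up front: a gradient step is accepted only when an ε/2-relaxed quadratic upper bound holds, and the estimate is doubled otherwise. Below is a minimal sketch of this idea in the spirit of the universal primal gradient method; it is an illustration under simplifying assumptions (Euclidean setting, a crude stopping rule), not the paper's full scheme, which also covers dual and accelerated fast-gradient variants.

```python
import numpy as np

def universal_primal_gradient(f, grad, x0, eps, L0=1.0, max_iter=1000):
    """Gradient method with backtracking on the smoothness estimate L.
    Only the target accuracy eps is needed; no smoothness constant."""
    x = np.asarray(x0, dtype=float)
    L = L0
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < 1e-12:      # (sub)gradient vanished: stop
            break
        while True:
            y = x - g / L                  # plain gradient step with estimate L
            d = y - x
            # eps/2-inexact descent test: accept the step if the relaxed
            # quadratic upper bound holds at the trial point
            if f(y) <= f(x) + g @ d + 0.5 * L * (d @ d) + 0.5 * eps:
                break
            L *= 2.0                       # estimate too optimistic: increase it
        x = y
        L *= 0.5                           # allow the estimate to shrink again
    return x

# Nonsmooth instance f(x) = |x|, run with a subgradient oracle: the method
# is never told the objective is nonsmooth, yet the backtracking adapts.
x_star = universal_primal_gradient(lambda x: abs(x[0]),
                                   lambda x: np.sign(x),
                                   np.array([5.0]), eps=1e-3)
```

On smooth problems the backtracking settles near the true Lipschitz constant and recovers the fast smooth rate; on nonsmooth problems the same test effectively shortens the steps, which is the behavior the abstract's experiments highlight.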

References in zbMATH (referenced in 16 articles)


  1. Ahookhosh, Masoud; Neumaier, Arnold: An optimal subgradient algorithm with subspace search for costly convex optimization problems (2019)
  2. Cartis, Coralia; Gould, Nick I.; Toint, Philippe L.: Universal regularization methods: varying the power, the smoothness and the accuracy (2019)
  3. Davis, Damek; Drusvyatskiy, Dmitriy: Stochastic model-based minimization of weakly convex functions (2019)
  4. Diakonikolas, Jelena; Orecchia, Lorenzo: The approximate duality gap technique: a unified theory of first-order methods (2019)
  5. Renegar, James: Accelerated first-order methods for hyperbolic programming (2019)
  6. Ahookhosh, Masoud; Neumaier, Arnold: Solving structured nonsmooth convex optimization with complexity $\mathcal{O}(\varepsilon^{-1/2})$ (2018)
  7. d’Aspremont, Alexandre; Guzmán, Cristóbal; Jaggi, Martin: Optimal affine-invariant smooth minimization algorithms (2018)
  8. Gasnikov, A. V.; Nesterov, Yu. E.: Universal method for stochastic composite optimization problems (2018)
  9. Ahookhosh, Masoud; Neumaier, Arnold: Optimal subgradient algorithms for large-scale convex optimization in simple domains (2017)
  10. Chen, Yunmei; Lan, Guanghui; Ouyang, Yuyuan: Accelerated schemes for a class of variational inequalities (2017)
  11. Grapiglia, G. N.; Nesterov, Yurii: Regularized Newton methods for minimizing functions with Hölder continuous Hessians (2017)
  12. Nesterov, Yurii; Stich, Sebastian U.: Efficiency of the accelerated coordinate descent method on structured optimization problems (2017)
  13. Ito, Masaru: New results on subgradient methods for strongly convex optimization problems with a unified analysis (2016)
  14. Kočvara, Michal; Nesterov, Yurii; Xia, Yu: A subgradient method for free material design (2016)
  15. Yashtini, Maryam: On the global convergence rate of the gradient descent method for functions with Hölder continuous gradients (2016)
  16. Nesterov, Yu.: Universal gradient methods for convex optimization problems (2015)