GradSamp

A robust gradient sampling algorithm for nonsmooth, nonconvex optimization. The authors describe a practical and robust algorithm for computing local minima of a continuous function of n real variables that is continuously differentiable on an open dense subset of R^n; the function need not be convex, nor even locally Lipschitz. The only requirement is that the gradient of the function can be computed easily wherever it is defined. (Source: http://plato.asu.edu)
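
The following is a minimal sketch, in Python, of the kind of gradient sampling iteration the entry refers to (after Burke, Lewis, and Overton, 2005): sample gradients at the current point and at nearby random points, take the minimum-norm element of their convex hull as a search direction, and apply a backtracking line search. The function name, parameter names, and default values below are illustrative assumptions, not the reference implementation; the full algorithm also shrinks the sampling radius adaptively, which is omitted here.

    import numpy as np
    from scipy.optimize import minimize

    def gradient_sampling(f, grad, x0, eps=1e-1, m=None, max_iter=100,
                          beta=1e-4, gamma=0.5, tol=1e-6, seed=0):
        """Sketch of a fixed-radius gradient sampling iteration.

        f, grad : user-supplied objective and gradient callables.
        All other names/defaults are illustrative, not from the paper.
        """
        rng = np.random.default_rng(seed)
        x = np.asarray(x0, dtype=float)
        n = x.size
        m = m or 2 * n  # number of sampled points per iteration (assumption)
        for _ in range(max_iter):
            # Sample gradients at x and at m random points within distance eps
            # (a box is used here for simplicity; the paper samples a ball).
            pts = x + eps * rng.uniform(-1.0, 1.0, size=(m, n))
            G = np.vstack([grad(x)] + [grad(p) for p in pts])   # (m+1, n)
            # Minimum-norm element of the convex hull of the sampled gradients,
            # obtained by solving a small QP over the simplex with SLSQP.
            k = G.shape[0]
            obj = lambda lam: 0.5 * np.dot(lam @ G, lam @ G)
            cons = ({'type': 'eq', 'fun': lambda lam: lam.sum() - 1.0},)
            res = minimize(obj, np.full(k, 1.0 / k), method='SLSQP',
                           bounds=[(0.0, 1.0)] * k, constraints=cons)
            g = res.x @ G
            if np.linalg.norm(g) <= tol:
                break  # approximately stationary at this sampling radius
            d = -g / np.linalg.norm(g)
            # Backtracking (Armijo) line search along the descent direction.
            t = 1.0
            while f(x + t * d) > f(x) - beta * t * np.linalg.norm(g):
                t *= gamma
                if t < 1e-12:
                    break
            x = x + t * d
        return x

For example, calling gradient_sampling(lambda x: abs(x[0]) + x[1]**2, lambda x: np.array([np.sign(x[0]), 2*x[1]]), [1.0, 1.0]) drives the iterate toward the nonsmooth minimizer at the origin, even though the objective is not differentiable there.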


References in zbMATH (referenced in 79 articles, 1 standard article)

Showing results 1 to 20 of 79, sorted by year (citations).


  1. Keskar, N.; Wächter, Andreas: A limited-memory quasi-Newton algorithm for bound-constrained non-smooth optimization (2019)
  2. Dolgopolik, M. V.: A convergence analysis of the method of codifferential descent (2018)
  3. Fiege, Sabrina; Walther, Andrea; Kulshreshtha, Kshitij; Griewank, Andreas: Algorithmic differentiation for piecewise smooth functions: a case study for robust optimization (2018)
  4. Hejazi, M. Alavi; Movahedian, N.; Nobakhtian, S.: On constraint qualifications and sensitivity analysis for general optimization problems via pseudo-Jacobians (2018)
  5. Helou, Elias S.; Santos, Sandra A.; Simões, Lucas E. A.: A fast gradient and function sampling method for finite-max functions (2018)
  6. Hoseini, N.; Nobakhtian, S.: A new trust region method for nonsmooth nonconvex optimization (2018)
  7. Jian, Jin-bao; Tang, Chun-ming; Shi, Lu: A feasible point method with bundle modification for nonsmooth convex constrained optimization (2018)
  8. Kazemi, Sajjad; Kanzi, Nader: Constraint qualifications and stationary conditions for mathematical programming with non-differentiable vanishing constraints (2018)
  9. Knossalla, Martin: Minimization of marginal functions in mathematical programming based on continuous outer subdifferentials (2018)
  10. Mengi, Emre: Large-scale and global maximization of the distance to instability (2018)
  11. Apkarian, Pierre; Noll, Dominikus: Worst-case stability and performance with mixed parametric and dynamic uncertainties (2017)
  12. Helou, Elias Salomão; Santos, Sandra A.; Simões, Lucas E. A.: On the local convergence analysis of the gradient sampling method for finite max-functions (2017)
  13. Hosseini, Seyedehsomayeh; Uschmajew, André: A Riemannian gradient sampling algorithm for nonsmooth optimization on manifolds (2017)
  14. Kungurtsev, Vyacheslav; Michiels, Wim; Diehl, Moritz: An SL/QP algorithm for minimizing the spectral abscissa of time delay systems (2017)
  15. Loreto, Milagros; Aponte, Hugo; Cores, Debora; Raydan, Marcos: Nonsmooth spectral gradient methods for unconstrained optimization (2017)
  16. Mahdavi-Amiri, N.; Shaeiri, M.: An adaptive competitive penalty method for nonsmooth constrained optimization (2017)
  17. Poirion, Fabrice; Mercier, Quentin; Désidéri, Jean-Antoine: Descent algorithm for nonsmooth stochastic multiobjective optimization (2017)
  18. Price, C. J.: A direct search quasi-Newton method for nonsmooth unconstrained optimization (2017)
  19. Sun, Hailin; Su, Che-Lin; Chen, Xiaojun: SAA-regularized methods for multiproduct price optimization under the pure characteristics demand model (2017)
  20. Welper, G.: Interpolation of functions with parameter dependent jumps by transformed snapshots (2017)
