SCALCG – Scaled conjugate gradient algorithms for unconstrained optimization. In this work we present and analyze a new scaled conjugate gradient algorithm and its implementation, based on an interpretation of the secant equation and on the inexact Wolfe line search conditions. The best spectral conjugate gradient algorithm SCG by Birgin and Martínez (2001), which is mainly a scaled variant of Perry's method (1977), is modified so as to overcome the lack of positive definiteness of the matrix defining the search direction. This modification is based on the quasi-Newton BFGS updating formula. The computational scheme is embedded in the Beale–Powell restart philosophy. The parameter scaling the gradient is selected either as the spectral gradient or in an anticipative manner, by means of a formula using the function values at two successive points. Under very mild conditions it is shown that, for strongly convex functions, the algorithm is globally convergent. Preliminary computational results, for a set of 500 unconstrained optimization test problems, show that this new scaled conjugate gradient algorithm substantially outperforms the spectral conjugate gradient SCG algorithm.
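The scaled conjugate gradient iteration described above can be sketched as follows. This is a simplified illustration, not the paper's SCALCG implementation: it uses a Birgin–Martínez-type scaled direction with the spectral scaling parameter θ = sᵀs / sᵀy, a backtracking Armijo line search standing in for the inexact Wolfe search, and steepest-descent restarts as a stand-in for the BFGS-based positive-definiteness correction and the Beale–Powell restarts. All function names and safeguard thresholds are illustrative assumptions.

```python
import numpy as np

def scaled_cg(f, grad, x0, max_iter=200, tol=1e-8):
    """Sketch of a spectral-scaled conjugate gradient method.

    Direction update (Birgin–Martínez style):
        d = -theta * g_new + beta * s,
        theta = (s @ s) / (s @ y),
        beta  = ((theta * y - s) @ g_new) / (s @ y).
    A backtracking Armijo search replaces the Wolfe search here
    for brevity; this is an illustrative stand-in, not SCALCG.
    """
    x = x0.astype(float)
    g = grad(x)
    d = -g                              # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0:                  # safeguard: restart if not a descent direction
            d = -g
        alpha, c1 = 1.0, 1e-4
        while f(x + alpha * d) > f(x) + c1 * alpha * (g @ d):
            alpha *= 0.5                # backtracking Armijo line search
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = s @ y
        if sy <= 1e-12:                 # curvature too small: restart
            d = -g_new
        else:
            theta = (s @ s) / sy        # spectral scaling parameter
            beta = ((theta * y - s) @ g_new) / sy
            d = -theta * g_new + beta * s
        x, g = x_new, g_new
    return x

# Example: minimize a strongly convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose unique minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
sol = scaled_cg(f, grad, np.zeros(2))
```

For the strongly convex case, the safeguarded restarts keep every search direction a descent direction, which is the setting in which the abstract's global convergence result applies.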

References in zbMATH (referenced in 100 articles)

Showing results 1 to 20 of 100.
Sorted by year (citations)


  1. Andrei, Neculai: A double parameter self-scaling memoryless BFGS method for unconstrained optimization (2020)
  2. Andrei, Neculai: New conjugate gradient algorithms based on self-scaling memoryless Broyden-Fletcher-Goldfarb-Shanno method (2020)
  3. Babaie-Kafaki, Saman: A modified scaled memoryless symmetric rank-one method (2020)
  4. Bojari, S.; Eslahchi, M. R.: Two families of scaled three-term conjugate gradient methods with sufficient descent property for nonconvex optimization (2020)
  5. Mahdavi-Amiri, N.; Shaeiri, M.: A conjugate gradient sampling method for nonsmooth optimization (2020)
  6. Nataj, Sarah; Lui, S. H.: Superlinear convergence of nonlinear conjugate gradient method and scaled memoryless BFGS method based on assumptions about the initial point (2020)
  7. Ou, Yigui; Lin, Haichan: A class of accelerated conjugate-gradient-like methods based on a modified secant equation (2020)
  8. Sellami, Badreddine; Chiheb Eddine Sellami, Mohamed: Global convergence of a modified Fletcher-Reeves conjugate gradient method with Wolfe line search (2020)
  9. Waziri, M. Y.; Ahmed, K.; Sabi’u, J.: A Dai-Liao conjugate gradient method via modified secant equation for system of nonlinear equations (2020)
  10. Yuan, Gonglin; Li, Tingting; Hu, Wujie: A conjugate gradient algorithm for large-scale nonlinear equations and image restoration problems (2020)
  11. Aminifard, Zohre; Babaie-Kafaki, Saman: An optimal parameter choice for the Dai-Liao family of conjugate gradient methods by avoiding a direction of the maximum magnification by the search direction matrix (2019)
  12. Babaie-Kafaki, Saman; Aminifard, Zohre: Two-parameter scaled memoryless BFGS methods with a nonmonotone choice for the initial step length (2019)
  13. Dehghani, R.; Mahdavi-Amiri, N.: Scaled nonlinear conjugate gradient methods for nonlinear least squares problems (2019)
  14. Faramarzi, Parvaneh; Amini, Keyvan: A modified spectral conjugate gradient method with global convergence (2019)
  15. Khoshgam, Zahra; Ashrafi, Ali: A new modified scaled conjugate gradient method for large-scale unconstrained optimization with non-convex objective function (2019)
  16. Liu, Hongwei; Liu, Zexian: An efficient Barzilai-Borwein conjugate gradient method for unconstrained optimization (2019)
  17. Liu, J. K.; Feng, Y. M.; Zou, L. M.: A spectral conjugate gradient method for solving large-scale unconstrained optimization (2019)
  18. Rezaee, Saeed; Babaie-Kafaki, Saman: An adaptive nonmonotone trust region method based on a modified scalar approximation of the Hessian in the successive quadratic subproblems (2019)
  19. Xue, Yanqin; Liu, Hongwei; Liu, Zexian: An improved nonmonotone adaptive trust region method (2019)
  20. Andrei, Neculai: A double-parameter scaling Broyden-Fletcher-Goldfarb-Shanno method based on minimizing the measure function of Byrd and Nocedal for unconstrained optimization (2018)
