SCALCG

SCALCG – Scaled conjugate gradient algorithms for unconstrained optimization. In this work we present and analyze a new scaled conjugate gradient algorithm and its implementation, based on an interpretation of the secant equation and on the inexact Wolfe line search conditions. The spectral conjugate gradient algorithm SCG of Birgin and Martínez (2001), which is essentially a scaled variant of Perry's method (1977), is modified so as to overcome the possible lack of positive definiteness of the matrix defining the search direction. This modification is based on the quasi-Newton BFGS updating formula, and the computational scheme is embedded in the Beale–Powell restart philosophy. The parameter scaling the gradient is selected either as the spectral gradient or in an anticipative manner, by means of a formula using the function values at two successive points. Under very mild conditions it is shown that, for strongly convex functions, the algorithm is globally convergent. Preliminary computational results, for a set of 500 unconstrained optimization test problems, show that this new scaled conjugate gradient algorithm substantially outperforms the spectral conjugate gradient algorithm SCG.
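To illustrate the kind of iteration the abstract describes, the following is a minimal sketch of a spectral scaled conjugate gradient method in the Birgin–Martínez style: the scaling parameter is the spectral (Barzilai–Borwein) quotient, the conjugacy parameter is Perry-type, and the direction falls back to a steepest-descent restart when descent is lost. This is an illustrative simplification, not Andrei's SCALCG implementation; it uses a simple Armijo backtracking line search in place of the full Wolfe conditions, and the function names (`scaled_cg`, `f`, `grad`) are hypothetical.

```python
import numpy as np

def scaled_cg(f, grad, x0, max_iter=500, tol=1e-8):
    """Spectral scaled CG sketch (Birgin-Martinez style, Perry-type beta)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking line search (stand-in for Wolfe conditions)
        alpha, c1, fx, gd = 1.0, 1e-4, f(x), g @ d
        while f(x + alpha * d) > fx + c1 * alpha * gd and alpha > 1e-16:
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = s @ y
        if sy > 1e-12:
            theta = (s @ s) / sy  # spectral (Barzilai-Borwein) scaling
            beta = ((theta * y - s) @ g_new) / sy  # Perry-type parameter
            d = -theta * g_new + beta * s
            # restart with scaled steepest descent if descent is lost
            if d @ g_new >= 0.0:
                d = -theta * g_new
        else:
            d = -g_new  # curvature condition failed: plain restart
        x, g = x_new, g_new
    return x
```

On a strongly convex quadratic, for instance f(x) = ½xᵀAx − bᵀx with A = diag(1, 10), the iteration converges to the unique minimizer A⁻¹b; the BFGS-based positive-definiteness safeguard that distinguishes SCALCG from SCG is not reproduced here.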


References in zbMATH (referenced in 47 articles)

Showing results 1 to 20 of 47, sorted by year (citations).


  1. Babaie-Kafaki, Saman: A modified scaling parameter for the memoryless BFGS updating formula (2016)
  2. Babaie-Kafaki, Saman; Ghanbari, Reza: A descent hybrid modification of the Polak-Ribière-Polyak conjugate gradient method (2016)
  3. Zhang, Yang; Dan, Bin: An efficient adaptive scaling parameter for the spectral conjugate gradient method (2016)
  4. Andrei, Neculai: A new three-term conjugate gradient algorithm for unconstrained optimization (2015)
  5. Babaie-Kafaki, Saman: On optimality of the parameters of self-scaling memoryless quasi-Newton updating formulae (2015)
  6. Dong, XiaoLiang; Liu, Hongwei; He, Yubo: A self-adjusting conjugate gradient method with sufficient descent condition and conjugacy condition (2015)
  7. Dong, Xiao Liang; Liu, Hongwei; Xu, Yin Ling; Yang, Xi Mei: Some nonlinear conjugate gradient methods with sufficient descent condition and global convergence (2015)
  8. Huang, Shuai; Wan, Zhong; Chen, Xiaohong: A new nonmonotone line search technique for unconstrained optimization (2015)
  9. Liu, Hao; Wang, Haijun; Qian, Xiaoyan; Rao, Feng: A conjugate gradient method with sufficient descent property (2015)
  10. Livieris, Ioannis E.; Pintelas, Panagiotis: A modified Perry conjugate gradient method and its global convergence (2015)
  11. Babaie-Kafaki, Saman: Two modified scaled nonlinear conjugate gradient methods (2014)
  12. Zhao, Lijuan; Sun, Wenyu; De Sampaio, Raimundo J.B.: Nonmonotone adaptive trust region method based on simple conic model for unconstrained optimization (2014)
  13. Andrei, Neculai: On three-term conjugate gradient algorithms for unconstrained optimization (2013)
  14. Andrei, Neculai: Another conjugate gradient algorithm with guaranteed descent and conjugacy conditions for large-scale unconstrained optimization (2013)
  15. Andrei, Neculai: Nonlinear optimization applications using the GAMS technology (2013)
  16. Andrei, Neculai: A simple three-term conjugate gradient algorithm for unconstrained optimization (2013)
  17. Babaie-Kafaki, Saman: A modified scaled memoryless BFGS preconditioned conjugate gradient method for unconstrained optimization (2013)
  18. Babaie-Kafaki, Saman: On the sufficient descent property of the Shanno’s conjugate gradient method (2013)
  19. Deng, Songhai; Wan, Zhong; Chen, Xiaohong: An improved spectral conjugate gradient algorithm for nonconvex unconstrained optimization problems (2013)
  20. Liu, Dongyi; Xu, Genqi: Symmetric Perry conjugate gradient method (2013)
