SCALCG

SCALCG – Scaled conjugate gradient algorithms for unconstrained optimization. In this work we present and analyze a new scaled conjugate gradient algorithm and its implementation, based on an interpretation of the secant equation and on the inexact Wolfe line search conditions. The best spectral conjugate gradient algorithm, SCG, of Birgin and Martínez (2001), which is essentially a scaled variant of Perry's (1977) method, is modified so as to overcome the possible lack of positive definiteness of the matrix defining the search direction. This modification is based on the quasi-Newton BFGS updating formula, and the computational scheme is embedded in the Beale–Powell restart philosophy. The parameter scaling the gradient is selected either as the spectral gradient or in an anticipative manner, by means of a formula using the function values at two successive points. Under very mild conditions it is shown that, for strongly convex functions, the algorithm is globally convergent. Preliminary computational results on a set of 500 unconstrained optimization test problems show that this new scaled conjugate gradient algorithm substantially outperforms the spectral conjugate gradient algorithm SCG.
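To make the computational scheme concrete, the sketch below (in Python, not the authors' implementation) illustrates one standard form of the scaled memoryless BFGS search direction with the spectral choice of the scaling parameter θ = sᵀs/yᵀs. The anticipative scaling formula, the Beale–Powell restart tests, and the safeguards of the published algorithm are omitted, and SciPy's Wolfe line search stands in for the paper's line search procedure; function names and tolerances are illustrative assumptions.

```python
# A minimal sketch, not the authors' SCALCG code: scaled conjugate gradient
# iteration whose direction is the memoryless BFGS direction built from
# H = theta * I, with theta chosen as the spectral quotient s's / y's.
# Anticipative scaling, Beale-Powell restarts, and all safeguards of the
# published algorithm are omitted here.
import numpy as np
from scipy.optimize import line_search

def scaled_cg_sketch(f, grad, x0, tol=1e-6, max_iter=1000):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # first step: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g, np.inf) <= tol:
            break
        alpha = line_search(f, grad, x, d, gfk=g)[0]   # Wolfe conditions
        if alpha is None:                    # search failed: restart on -g
            d = -g
            alpha = line_search(f, grad, x, d, gfk=g)[0]
            if alpha is None:
                break
        s = alpha * d                        # s_k = x_{k+1} - x_k
        x = x + s
        g_new = grad(x)
        y = g_new - g                        # y_k = g_{k+1} - g_k
        ys = y @ s
        if ys <= 1e-12:                      # curvature condition violated:
            d = -g_new                       # fall back to steepest descent
        else:
            theta = (s @ s) / ys             # spectral scaling parameter
            gs, gy = g_new @ s, g_new @ y
            # d = -H g_new, with H the BFGS update of theta * I:
            d = (-theta * g_new + theta * (gs / ys) * y
                 - ((1.0 + theta * (y @ y) / ys) * gs / ys
                    - theta * gy / ys) * s)
        g = g_new
    return x
```

As a quick check, on the strongly convex quadratic f(x) = xᵀx the sketch reaches the origin almost immediately: scaled_cg_sketch(lambda x: x @ x, lambda x: 2 * x, np.ones(10)).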


References in zbMATH (referenced in 58 articles)

Showing results 1 to 20 of 58, sorted by year (citations).


  1. Andrei, Neculai: Accelerated adaptive Perry conjugate gradient algorithms based on the self-scaling memoryless BFGS update (2017)
  2. Andrei, Neculai: Eigenvalues versus singular values study in conjugate gradient algorithms for large-scale unconstrained optimization (2017)
  3. Babaie-Kafaki, Saman; Ghanbari, Reza: A class of descent four-term extension of the Dai–Liao conjugate gradient method based on the scaled memoryless BFGS update (2017)
  4. Xu, Ling; Ding, Feng: Recursive least squares and multi-innovation stochastic gradient parameter estimation methods for signal modeling (2017)
  5. Andrei, Neculai: An adaptive conjugate gradient algorithm for large-scale unconstrained optimization (2016)
  6. Andrei, Neculai: A new adaptive conjugate gradient algorithm for large-scale unconstrained optimization (2016)
  7. Babaie-Kafaki, Saman: A modified scaling parameter for the memoryless BFGS updating formula (2016)
  8. Babaie-Kafaki, Saman; Ghanbari, Reza: A descent hybrid modification of the Polak–Ribière–Polyak conjugate gradient method (2016)
  9. Zhang, Yang; Dan, Bin: An efficient adaptive scaling parameter for the spectral conjugate gradient method (2016)
  10. Andrei, Neculai: A new three-term conjugate gradient algorithm for unconstrained optimization (2015)
  11. Babaie-Kafaki, Saman: On optimality of the parameters of self-scaling memoryless quasi-Newton updating formulae (2015)
  12. Dong, XiaoLiang; Liu, Hongwei; He, Yubo: A self-adjusting conjugate gradient method with sufficient descent condition and conjugacy condition (2015)
  13. Dong, Xiao Liang; Liu, Hongwei; Xu, Yin Ling; Yang, Xi Mei: Some nonlinear conjugate gradient methods with sufficient descent condition and global convergence (2015)
  14. Huang, Shuai; Wan, Zhong; Chen, Xiaohong: A new nonmonotone line search technique for unconstrained optimization (2015)
  15. Liu, Hao; Wang, Haijun; Qian, Xiaoyan; Rao, Feng: A conjugate gradient method with sufficient descent property (2015)
  16. Livieris, Ioannis E.; Pintelas, Panagiotis: A modified Perry conjugate gradient method and its global convergence (2015)
  17. Babaie-Kafaki, Saman: Two modified scaled nonlinear conjugate gradient methods (2014)
  18. Zhao, Lijuan; Sun, Wenyu; De Sampaio, Raimundo J.B.: Nonmonotone adaptive trust region method based on simple conic model for unconstrained optimization (2014)
  19. Andrei, Neculai: A simple three-term conjugate gradient algorithm for unconstrained optimization (2013)
  20. Andrei, Neculai: Another conjugate gradient algorithm with guaranteed descent and conjugacy conditions for large-scale unconstrained optimization (2013)
