CG_DESCENT

Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent. Recently, a new nonlinear conjugate gradient scheme was developed which satisfies the descent condition $g_k^T d_k \le -\frac{7}{8}\|g_k\|^2$ and which is globally convergent whenever the line search fulfills the Wolfe conditions. This article studies the convergence behavior of the algorithm; extensive numerical tests and comparisons with other methods for large-scale unconstrained optimization are given.
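The scheme combines a Hager-Zhang-type direction update with a Wolfe line search. The sketch below is a minimal illustration of that iteration in Python, not the CG_DESCENT implementation itself (which adds an approximate Wolfe line search, safeguards on the update parameter, and restarts); the function name `cg_hz` and the use of SciPy's `line_search` (which enforces the strong Wolfe conditions) are assumptions made for this example.

```python
import numpy as np
from scipy.optimize import line_search

def cg_hz(f, grad, x0, tol=1e-6, max_iter=1000):
    """Minimal sketch of nonlinear CG with the Hager-Zhang direction update.
    Not the CG_DESCENT code: no approximate Wolfe search, no beta truncation."""
    x = np.asarray(x0, dtype=float).copy()
    g = grad(x)
    d = -g                                  # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g, np.inf) < tol:
            break
        # Wolfe line search along d (SciPy implements the strong Wolfe conditions)
        alpha, *_ = line_search(f, grad, x, d, gfk=g)
        if alpha is None:                   # search failed: restart with steepest descent
            d = -g
            alpha, *_ = line_search(f, grad, x, d, gfk=g)
            if alpha is None:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        dy = d @ y
        # Hager-Zhang update parameter beta_k (Hager & Zhang, 2005)
        beta = (y - 2.0 * d * (y @ y) / dy) @ g_new / dy
        d = -g_new + beta * d               # new search direction
        x, g = x_new, g_new
    return x
```

As a usage example, `cg_hz(rosen, rosen_der, np.array([-1.2, 1.0]))` with SciPy's Rosenbrock helpers `scipy.optimize.rosen` and `rosen_der` should converge to the minimizer at (1, 1).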

This software is also peer reviewed by the journal ACM Transactions on Mathematical Software (TOMS).


References in zbMATH (referenced in 71 articles, 1 standard article)

Showing results 1 to 20 of 71, sorted by year (citations).


  1. Gnandt, Christian; Callies, Rainer: CGRS -- an advanced hybrid method for global optimization of continuous functions closely coupling extended random search and conjugate gradient method (2018)
  2. Li, Min: A modified Hestenes-Stiefel conjugate gradient method close to the memoryless BFGS quasi-Newton method (2018)
  3. Li, Xiangrong; Wang, Xiaoliang; Sheng, Zhou; Duan, Xiabin: A modified conjugate gradient algorithm with backtracking line search technique for large-scale nonlinear equations (2018)
  4. Andrei, Neculai: Accelerated adaptive Perry conjugate gradient algorithms based on the self-scaling memoryless BFGS update (2017)
  5. Babaie-Kafaki, Saman; Ghanbari, Reza: A class of descent four-term extension of the Dai-Liao conjugate gradient method based on the scaled memoryless BFGS update (2017)
  6. Babaie-Kafaki, Saman; Ghanbari, Reza: A class of adaptive Dai-Liao conjugate gradient methods based on the scaled memoryless BFGS update (2017)
  7. Chen, Jingrun; García-Cervera, Carlos J.: An efficient multigrid strategy for large-scale molecular mechanics optimization (2017)
  8. Huang, Yuanyuan; Liu, Changhe: Dai-Kou type conjugate gradient methods with a line search only using gradient (2017)
  9. Uzunca, Murat; Küçükseyhan, Tuğba; Yücel, Hamdullah; Karasözen, Bülent: Optimal control of convective FitzHugh-Nagumo equation (2017)
  10. Wu, Yanlin: A modified three-term PRP conjugate gradient algorithm for optimization models (2017)
  11. Ziadi, Raouf; Ellaia, Rachid; Bencherif-Madani, Abdelatif: Global optimization through a stochastic perturbation of the Polak-Ribière conjugate gradient method (2017)
  12. Andrei, Neculai: An adaptive conjugate gradient algorithm for large-scale unconstrained optimization (2016)
  13. Babaie-Kafaki, Saman: On optimality of two adaptive choices for the parameter of Dai-Liao method (2016)
  14. Babaie-Kafaki, Saman; Ghanbari, Reza: A descent hybrid modification of the Polak-Ribière-Polyak conjugate gradient method (2016)
  15. Du, Xuewu; Zhang, Peng; Ma, Wenya: Some modified conjugate gradient methods for unconstrained optimization (2016)
  16. Fatemi, M.: An optimal parameter for Dai-Liao family of conjugate gradient methods (2016)
  17. Iiduka, Hideaki: Line search fixed point algorithms based on nonlinear conjugate gradient directions: application to constrained smooth convex optimization (2016)
  18. Zhang, Yang; Dan, Bin: An efficient adaptive scaling parameter for the spectral conjugate gradient method (2016)
  19. Al-Baali, Mehiddin; Narushima, Yasushi; Yabe, Hiroshi: A family of three-term conjugate gradient methods with sufficient descent property for unconstrained optimization (2015)
  20. Huang, Shuai; Wan, Zhong; Chen, Xiaohong: A new nonmonotone line search technique for unconstrained optimization (2015)
