ACGSSV

Accelerated adaptive Perry conjugate gradient algorithms based on the self-scaling memoryless BFGS update. An accelerated adaptive class of nonlinear conjugate gradient algorithms is suggested. The search direction in these algorithms is given by the symmetrization of the scaled Perry conjugate gradient direction (Perry, 1978), which depends on a positive parameter. The value of this parameter is determined by minimizing, in the Frobenius norm, the distance between the symmetric scaled Perry search direction matrix and the self-scaling memoryless BFGS update of Oren. Two variants of the parameter in the search direction are presented: those given by Oren and Luenberger (1973/74) and by Oren and Spedicato (1976). The corresponding algorithm, ACGSSV, is equipped with a well-known acceleration scheme for conjugate gradient algorithms. Global convergence of the algorithm is established for both uniformly convex and general nonlinear functions under the exact or the Wolfe line search. Using a set of 800 unconstrained optimization test problems of different structure and complexity, we show that this selection of the scaling parameter in the self-scaling memoryless BFGS update leads to algorithms which substantially outperform the CG_DESCENT, SCALCG, and CONMIN conjugate gradient algorithms, being both more efficient and more robust. However, the conjugate gradient algorithm ADCG, based on clustering the eigenvalues of the iteration matrix defined by the search direction, is more efficient and slightly more robust than our ACGSSV algorithm. By solving five applications from the MINPACK-2 test problem collection, each with a large number of variables, we show that the adaptive Perry conjugate gradient algorithm based on the self-scaling memoryless BFGS update, endowed with the acceleration scheme, is a top performer versus CG_DESCENT.
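As a rough sketch of the update underlying this class of methods, the Python fragment below computes a self-scaling memoryless BFGS search direction d = -H g, where H is the standard BFGS update of the scaled identity tau*I built from the step s and the gradient difference y. All names (ssml_bfgs_direction, tau_spectral, tau_reciprocal) and the toy driver are illustrative assumptions, not the paper's code; the two tau formulas are the classical self-scaling values associated with Oren, Luenberger, and Spedicato (without attempting to match each formula to its exact attribution), and the Frobenius-norm-optimal parameter and the acceleration step of ACGSSV itself are omitted.

    import numpy as np

    def ssml_bfgs_direction(g, s, y, tau):
        # d = -H g, with H the memoryless BFGS update of tau*I built from
        # the step s = x_{k+1} - x_k and the gradient difference
        # y = g_{k+1} - g_k. Only inner products are used, so no n-by-n
        # matrix is ever formed.
        ys = y @ s                      # curvature s^T y (> 0 under Wolfe)
        sg = s @ g
        yg = y @ g
        return (-tau * g
                + tau * (yg / ys) * s
                + tau * (sg / ys) * y
                - (1.0 + tau * (y @ y) / ys) * (sg / ys) * s)

    def tau_spectral(s, y):
        # tau = s^T y / y^T y, one classical self-scaling value
        return (s @ y) / (y @ y)

    def tau_reciprocal(s, y):
        # tau = s^T s / s^T y, the other classical self-scaling value
        return (s @ s) / (s @ y)

    if __name__ == "__main__":
        # Toy check on a convex quadratic f(x) = 0.5 x^T A x, g(x) = A x.
        rng = np.random.default_rng(0)
        M = rng.standard_normal((6, 6))
        A = M @ M.T + 6 * np.eye(6)     # symmetric positive definite
        x0 = rng.standard_normal(6)
        g0 = A @ x0
        x1 = x0 - 1e-2 * g0             # one plain gradient step
        g1 = A @ x1
        s, y = x1 - x0, g1 - g0
        d = ssml_bfgs_direction(g1, s, y, tau_spectral(s, y))
        print("descent direction:", g1 @ d < 0)   # expected: True

With positive curvature (s^T y > 0, guaranteed by the Wolfe conditions) and tau > 0, H is positive definite, so the computed d is always a descent direction. In ACGSSV itself the scaling parameter is instead chosen adaptively via the Frobenius-norm minimization described above, and the iterates are further modified by the acceleration scheme; the sketch only illustrates the memoryless structure that keeps the method storage-free.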


References in zbMATH (referenced in 11 articles)


  1. Andrei, Neculai: A note on memory-less SR1 and memory-less BFGS methods for large-scale unconstrained optimization (2022)
  2. Kaelo, P.; Koorapetse, M.; Sam, C. R.: A globally convergent derivative-free projection method for nonlinear monotone equations with applications (2021)
  3. Bojari, S.; Eslahchi, M. R.: Two families of scaled three-term conjugate gradient methods with sufficient descent property for nonconvex optimization (2020)
  4. Liu, Meixing; Ma, Guodong; Yin, Jianghua: Two new conjugate gradient methods for unconstrained optimization (2020)
  5. Waziri, Mohammed Yusuf; Hungu, Kabiru Ahmed; Sabi’u, Jamilu: Descent Perry conjugate gradient methods for systems of monotone nonlinear equations (2020)
  6. Waziri, M. Y.; Ahmed, K.; Sabi’u, J.: A Dai-Liao conjugate gradient method via modified secant equation for system of nonlinear equations (2020)
  7. Gao, Peiting; He, Chuanjiang; Liu, Yang: An adaptive family of projection methods for constrained monotone nonlinear equations with applications (2019)
  8. Li, Yufei; Liu, Zexian; Liu, Hongwei: A subspace minimization conjugate gradient method based on conic model for unconstrained optimization (2019)
  9. Dong, XiaoLiang; Han, Deren; Dai, Zhifeng; Li, Lixiang; Zhu, Jianguang: An accelerated three-term conjugate gradient method with sufficient descent condition and conjugacy condition (2018)
  10. Yao, Shengwei; Ning, Liangshuo: An adaptive three-term conjugate gradient method based on self-scaling memoryless BFGS matrix (2018)
  11. Andrei, Neculai: Accelerated adaptive Perry conjugate gradient algorithms based on the self-scaling memoryless BFGS update (2017)