CUTE: Constrained and Unconstrained Testing Environment

The purpose of this article is to discuss the scope and functionality of a versatile environment for testing small- and large-scale nonlinear optimization algorithms. Although many of these facilities were originally produced by the authors in conjunction with the software package LANCELOT, we believe that they will be useful in their own right and should be available to researchers for their development of optimization software. The tools can be obtained by anonymous ftp from a number of sources and may, in many cases, be installed automatically.

The scope of a major collection of test problems written in the standard input format (SIF) used by the LANCELOT software package is described. Recognizing that most software was not written with the SIF in mind, we provide tools to assist in building an interface between this input format and other optimization packages. These tools provide a link between the SIF and a number of existing packages, including MINOS and OSL. Additionally, as each problem includes a specific classification that is designed to be useful in identifying particular classes of problems, facilities are provided to build and manage a database of this information.

There is a Unix and C shell bias to many of the descriptions in the article since, for the sake of simplicity, we do not illustrate everything in its fullest generality. We trust that the majority of potential users are sufficiently familiar with Unix that these examples will not lead to undue confusion.
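The classification attached to each SIF problem is a short coded string. As a minimal sketch of how such a classification might be decoded when building a problem database, assuming the general field layout described in the CUTE paper (the specific letter-to-meaning tables below are illustrative assumptions, not quoted from the distribution):

```python
# Illustrative only: decode a CUTE-style classification string such as
# "SUR2-AN-2-0". The field layout (objective type, constraint type,
# regularity, derivative order, origin, internal variables, problem
# dimensions) follows the scheme sketched in the paper; the exact code
# tables here are assumptions for the sake of the example.

OBJECTIVE = {"N": "none", "C": "constant", "L": "linear",
             "Q": "quadratic", "S": "sum of squares", "O": "other"}
CONSTRAINTS = {"U": "unconstrained", "X": "fixed variables only",
               "B": "bounds only", "N": "network",
               "L": "linear", "Q": "quadratic", "O": "other"}

def parse_classification(code):
    """Split a classification such as 'SUR2-AN-2-0' into named fields."""
    head, origin, n, m = code.split("-")
    return {
        "objective": OBJECTIVE[head[0]],
        "constraints": CONSTRAINTS[head[1]],
        "regular": head[2] == "R",         # twice continuously differentiable
        "derivative_order": int(head[3]),  # highest analytic derivative given
        "origin": origin[0],               # e.g. academic / modelling / real
        "internal_variables": origin[1] == "Y",
        "n_variables": n,                  # a number, or 'V' for variable
        "n_constraints": m,
    }

print(parse_classification("SUR2-AN-2-0"))
```

A database of such records could then be filtered, for example, to select all unconstrained sum-of-squares problems with analytic second derivatives.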

This software has also been peer reviewed by the journal ACM Transactions on Mathematical Software (TOMS).

References in zbMATH (referenced in 204 articles, 1 standard article)

Showing results 1 to 20 of 204, sorted by year (citations).


  1. Li, Min: A three term Polak-Ribière-Polyak conjugate gradient method close to the memoryless BFGS quasi-Newton method (2020)
  2. Liu, J. K.; Zhao, Y. X.; Wu, X. L.: Some three-term conjugate gradient methods with the new direction structure (2020)
  3. Liu, Xin-Wei; Dai, Yu-Hong: A globally convergent primal-dual interior-point relaxation method for nonlinear programs (2020)
  4. Amini, Keyvan; Faramarzi, Parvaneh; Pirfalah, Nasrin: A modified Hestenes-Stiefel conjugate gradient method with an optimal property (2019)
  5. Andrei, Neculai: A new diagonal quasi-Newton updating method with scaled forward finite differences directional derivative for unconstrained optimization (2019)
  6. Boggs, Paul T.; Byrd, Richard H.: Adaptive, limited-memory BFGS algorithms for unconstrained optimization (2019)
  7. Dong, Wen-Li; Li, Xing; Peng, Zheng: A simulated annealing-based Barzilai-Borwein gradient method for unconstrained optimization problems (2019)
  8. Faramarzi, Parvaneh; Amini, Keyvan: A scaled three-term conjugate gradient method for large-scale unconstrained optimization problem (2019)
  9. Faramarzi, Parvaneh; Amini, Keyvan: A modified spectral conjugate gradient method with global convergence (2019)
  10. Sim, Hong Seng; Leong, Wah June; Chen, Chuei Yee: Gradient method with multiple damping for large-scale unconstrained optimization (2019)
  11. Vlček, Jan; Lukšan, Ladislav: A limited-memory optimization method using the infinitely many times repeated BNS update and conjugate directions (2019)
  12. Vlček, Jan; Lukšan, Ladislav: Properties of the block BFGS update and its application to the limited-memory block BNS method for unconstrained minimization (2019)
  13. Wang, Guoqiang; Yu, Bo: PAL-Hom method for QP and an application to LP (2019)
  14. Ali, M. Montaz; Oliphant, Terry-Leigh: A trajectory-based method for constrained nonlinear optimization problems (2018)
  15. Dong, XiaoLiang; Han, Deren; Dai, Zhifeng; Li, Lixiang; Zhu, Jianguang: An accelerated three-term conjugate gradient method with sufficient descent condition and conjugacy condition (2018)
  16. Lee, M. S.; Goh, B. S.; Harno, H. G.; Lim, K. H.: On a two-phase approximate greatest descent method for nonlinear optimization with equality constraints (2018)
  17. Li, Dan; Zhu, Detong: An affine scaling interior trust-region method combining with line search filter technique for optimization subject to bounds on variables (2018)
  18. Li, Min: A family of three-term nonlinear conjugate gradient methods close to the memoryless BFGS method (2018)
  19. Li, Min: A modified Hestenes-Stiefel conjugate gradient method close to the memoryless BFGS quasi-Newton method (2018)
  20. Li, Ming; Liu, Hongwei; Liu, Zexian: A new family of conjugate gradient methods for unconstrained optimization (2018)
