CUTE

CUTE: Constrained and Unconstrained Testing Environment. The purpose of this article is to discuss the scope and functionality of a versatile environment for testing small- and large-scale nonlinear optimization algorithms. Although many of these facilities were originally produced by the authors in conjunction with the software package LANCELOT, we believe that they will be useful in their own right and should be available to researchers for their development of optimization software. The tools can be obtained by anonymous ftp from a number of sources and may, in many cases, be installed automatically. The scope of a major collection of test problems written in the standard input format (SIF) used by the LANCELOT software package is described. Recognizing that most software was not written with the SIF in mind, we provide tools to assist in building an interface between this input format and other optimization packages. These tools provide a link between the SIF and a number of existing packages, including MINOS and OSL. Additionally, as each problem includes a specific classification that is designed to be useful in identifying particular classes of problems, facilities are provided to build and manage a database of this information. There is a Unix and C shell bias to many of the descriptions in the article, since, for the sake of simplicity, we do not illustrate everything in its fullest generality. We trust that the majority of potential users are sufficiently familiar with Unix that these examples will not lead to undue confusion.
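
The classification mentioned in the abstract is a short coded string attached to each problem in the SIF collection. As an illustration only (this is not part of the CUTE tools), the Python sketch below decodes strings of the general form used by the collection, for example "SUR2-AN-V-0", into readable fields from which a small classification database could be built and queried; the field meanings reflect our reading of the CUTE classification scheme and should be verified against the official documentation.

    # Hypothetical helper (not part of the CUTE distribution): decode a SIF
    # classification string such as "SUR2-AN-V-0" into readable fields.  The
    # field meanings below reflect our reading of the published CUTE
    # classification scheme and should be checked against the documentation.

    OBJECTIVE = {"N": "no objective", "C": "constant", "L": "linear",
                 "Q": "quadratic", "S": "sum of squares", "O": "other"}
    CONSTRAINTS = {"U": "unconstrained", "X": "fixed variables only",
                   "B": "bounds only", "N": "network", "L": "linear",
                   "Q": "quadratic", "O": "general"}
    ORIGIN = {"A": "academic", "M": "modelling exercise", "R": "real application"}

    def parse_classification(code):
        """Split a classification string, e.g. 'SUR2-AN-V-0', into its fields."""
        head, origin, nvar, ncon = code.split("-")
        return {
            "objective": OBJECTIVE[head[0]],
            "constraints": CONSTRAINTS[head[1]],
            "smoothness": "regular" if head[2] == "R" else "irregular",
            "derivatives available": int(head[3]),
            "origin": ORIGIN[origin[0]],
            "internal variables": origin[1] == "Y",
            "variables": None if nvar == "V" else int(nvar),      # None = variable size
            "constraint count": None if ncon == "V" else int(ncon),
        }

    if __name__ == "__main__":
        # Example: sum-of-squares objective, unconstrained, regular, second
        # derivatives supplied, academic origin, variable dimension, no constraints.
        print(parse_classification("SUR2-AN-V-0"))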

This software is also peer reviewed by the journal ACM Transactions on Mathematical Software (TOMS).


References in zbMATH (referenced in 214 articles, 1 standard article)

Showing results 1 to 20 of 214, sorted by year (citations).

  1. Andrei, Neculai: New conjugate gradient algorithms based on self-scaling memoryless Broyden-Fletcher-Goldfarb-Shanno method (2020)
  2. Andrei, Neculai: Diagonal approximation of the Hessian by finite differences for unconstrained optimization (2020)
  3. Andrei, Neculai: A double parameter self-scaling memoryless BFGS method for unconstrained optimization (2020)
  4. Dai, Yu-Hong; Liu, Xin-Wei; Sun, Jie: A primal-dual interior-point method capable of rapidly detecting infeasibility for nonlinear programs (2020)
  5. Gill, Philip E.; Kungurtsev, Vyacheslav; Robinson, Daniel P.: A shifted primal-dual penalty-barrier method for nonlinear optimization (2020)
  6. Li, Min: A three term Polak-Ribière-Polyak conjugate gradient method close to the memoryless BFGS quasi-Newton method (2020)
  7. Liu, J. K.; Zhao, Y. X.; Wu, X. L.: Some three-term conjugate gradient methods with the new direction structure (2020)
  8. Liu, Meixing; Ma, Guodong; Yin, Jianghua: Two new conjugate gradient methods for unconstrained optimization (2020)
  9. Liu, Xin-Wei; Dai, Yu-Hong: A globally convergent primal-dual interior-point relaxation method for nonlinear programs (2020)
  10. Ou, Yigui; Lin, Haichan: A class of accelerated conjugate-gradient-like methods based on a modified secant equation (2020)
  11. Sellami, Badreddine; Chiheb Eddine Sellami, Mohamed: Global convergence of a modified Fletcher-Reeves conjugate gradient method with Wolfe line search (2020)
  12. Amini, Keyvan; Faramarzi, Parvaneh; Pirfalah, Nasrin: A modified Hestenes-Stiefel conjugate gradient method with an optimal property (2019)
  13. Andrei, Neculai: A new diagonal quasi-Newton updating method with scaled forward finite differences directional derivative for unconstrained optimization (2019)
  14. Boggs, Paul T.; Byrd, Richard H.: Adaptive, limited-memory BFGS algorithms for unconstrained optimization (2019)
  15. Dong, Wen-Li; Li, Xing; Peng, Zheng: A simulated annealing-based Barzilai-Borwein gradient method for unconstrained optimization problems (2019)
  16. Faramarzi, Parvaneh; Amini, Keyvan: A scaled three-term conjugate gradient method for large-scale unconstrained optimization problem (2019)
  17. Faramarzi, Parvaneh; Amini, Keyvan: A modified spectral conjugate gradient method with global convergence (2019)
  18. Liu, J. K.; Feng, Y. M.; Zou, L. M.: A spectral conjugate gradient method for solving large-scale unconstrained optimization (2019)
  19. Sim, Hong Seng; Leong, Wah June; Chen, Chuei Yee: Gradient method with multiple damping for large-scale unconstrained optimization (2019)
  20. Vlček, Jan; Lukšan, Ladislav: A limited-memory optimization method using the infinitely many times repeated BNS update and conjugate directions (2019)
