CUTE: Constrained and Unconstrained Testing Environment.

The purpose of this article is to discuss the scope and functionality of a versatile environment for testing small- and large-scale nonlinear optimization algorithms. Although many of these facilities were originally produced by the authors in conjunction with the software package LANCELOT, we believe that they will be useful in their own right and should be available to researchers for their development of optimization software. The tools can be obtained by anonymous ftp from a number of sources and may, in many cases, be installed automatically. The scope of a major collection of test problems written in the standard input format (SIF) used by the LANCELOT software package is described. Recognizing that most software was not written with the SIF in mind, we provide tools to assist in building an interface between this input format and other optimization packages. These tools provide a link between the SIF and a number of existing packages, including MINOS and OSL. Additionally, as each problem includes a specific classification that is designed to be useful in identifying particular classes of problems, facilities are provided to build and manage a database of this information. There is a Unix and C shell bias to many of the descriptions in the article since, for the sake of simplicity, we do not illustrate everything in its fullest generality. We trust that the majority of potential users are sufficiently familiar with Unix that these examples will not lead to undue confusion.
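The classification mentioned in the abstract encodes, in a short string, the objective type, constraint type, regularity, highest order of analytic derivatives, problem origin, and the numbers of variables and constraints. As a minimal sketch of how such a string might be decomposed for a problem database, assuming a layout of the form `SUR2-AN-2-0` (four-character head, origin/internals pair, then variable and constraint counts, with `V` marking variable dimensions); the field names and the example string are illustrative, not part of the official tools:

```python
# Hedged sketch of a parser for a CUTE-style classification string.
# The field layout assumed here: head = objective, constraint type,
# regularity, derivative order; then origin + internal-variables flag;
# then #variables and #constraints ('V' = variable dimension).

OBJECTIVE = {
    "N": "no objective", "C": "constant", "L": "linear",
    "Q": "quadratic", "S": "sum of squares", "O": "other",
}
CONSTRAINT_TYPE = {
    "U": "unconstrained", "X": "fixed variables only", "B": "bounds only",
    "N": "network", "L": "linear", "Q": "quadratic", "O": "other",
}
ORIGIN = {"A": "academic", "M": "modelling", "R": "real-world"}

def parse_classification(code: str) -> dict:
    """Split a classification string such as 'SUR2-AN-2-0' into fields."""
    head, origin, n_var, n_con = code.split("-")
    return {
        "objective": OBJECTIVE[head[0]],
        "constraint_type": CONSTRAINT_TYPE[head[1]],
        "regular": head[2] == "R",          # 'R' regular, 'I' irregular
        "derivative_order": int(head[3]),   # highest analytic derivatives
        "origin": ORIGIN[origin[0]],
        "internal_variables": origin[1] == "Y",
        "n_variables": None if n_var == "V" else int(n_var),
        "n_constraints": None if n_con == "V" else int(n_con),
    }

print(parse_classification("SUR2-AN-2-0"))
```

A database of problem classifications can then be filtered by such fields, e.g. selecting all unconstrained sum-of-squares problems of fixed dimension.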

This software is also peer reviewed by the journal ACM Transactions on Mathematical Software (TOMS).

References in zbMATH (referenced in 226 articles, 1 standard article)

Showing results 1 to 20 of 226.
Sorted by year (citations)


  1. Andrei, Neculai: A new accelerated diagonal quasi-Newton updating method with scaled forward finite differences directional derivative for unconstrained optimization (2021)
  2. Berahas, Albert S.; Curtis, Frank E.; Robinson, Daniel; Zhou, Baoyu: Sequential quadratic optimization for nonlinear equality constrained stochastic optimization (2021)
  3. Faramarzi, Parvaneh; Amini, Keyvan: A spectral three-term Hestenes-Stiefel conjugate gradient method (2021)
  4. Ivanov, Branislav; Stanimirović, Predrag S.; Shaini, Bilall I.; Ahmad, Hijaz; Wang, Miao-Kun: A novel value for the parameter in the Dai-Liao-type conjugate gradient method (2021)
  5. Leong, Wah June; Enshaei, Sharareh; Kek, Sie Long: Diagonal quasi-Newton methods via least change updating principle with weighted Frobenius norm (2021)
  6. Vlček, Jan; Lukšan, Ladislav: Two limited-memory optimization methods with minimum violation of the previous secant conditions (2021)
  7. Andrei, Neculai: Diagonal approximation of the Hessian by finite differences for unconstrained optimization (2020)
  8. Andrei, Neculai: New conjugate gradient algorithms based on self-scaling memoryless Broyden-Fletcher-Goldfarb-Shanno method (2020)
  9. Andrei, Neculai: A double parameter self-scaling memoryless BFGS method for unconstrained optimization (2020)
  10. Dai, Yu-Hong; Liu, Xin-Wei; Sun, Jie: A primal-dual interior-point method capable of rapidly detecting infeasibility for nonlinear programs (2020)
  11. Gill, Philip E.; Kungurtsev, Vyacheslav; Robinson, Daniel P.: A shifted primal-dual penalty-barrier method for nonlinear optimization (2020)
  12. Li, Min: A three term Polak-Ribière-Polyak conjugate gradient method close to the memoryless BFGS quasi-Newton method (2020)
  13. Li, Pengyuan; Wang, Zhan; Luo, Dan; Pham, Hongtruong: Global convergence of a modified two-parameter scaled BFGS method with Yuan-Wei-Lu line search for unconstrained optimization (2020)
  14. Liu, J. K.; Zhao, Y. X.; Wu, X. L.: Some three-term conjugate gradient methods with the new direction structure (2020)
  15. Liu, Meixing; Ma, Guodong; Yin, Jianghua: Two new conjugate gradient methods for unconstrained optimization (2020)
  16. Liu, Xin-Wei; Dai, Yu-Hong: A globally convergent primal-dual interior-point relaxation method for nonlinear programs (2020)
  17. Ou, Yigui; Lin, Haichan: A class of accelerated conjugate-gradient-like methods based on a modified secant equation (2020)
  18. Sellami, Badreddine; Chiheb Eddine Sellami, Mohamed: Global convergence of a modified Fletcher-Reeves conjugate gradient method with Wolfe line search (2020)
  19. Stanimirović, Predrag S.; Ivanov, Branislav; Ma, Haifeng; Mosić, Dijana: A survey of gradient methods for solving nonlinear optimization (2020)
  20. Tang, Chunming; Li, Shuangyu; Cui, Zengru: Least-squares-based three-term conjugate gradient methods (2020)
