CUTEr is a versatile testing environment for optimization and linear algebra solvers. The package contains a collection of test problems, along with Fortran 77, Fortran 90/95 and Matlab tools intended to help developers design, compare and improve new and existing solvers. The test problems are written in the so-called Standard Input Format (SIF). A decoder that translates this format into well-defined Fortran 77 source and data files is available as a separate package. Once translated, these files may be combined with the provided tools to test optimization packages. Ready-to-use interfaces to existing packages, such as MINOS, SNOPT, filterSQP, Knitro, and more, are provided; see the interfaces section for a complete list.

References in zbMATH (referenced in 494 articles, 1 standard article)

Showing results 1 to 20 of 494, sorted by year (citations).


  1. Jiang, Xianzhen; Jian, Jinbao: Improved Fletcher-Reeves and Dai-Yuan conjugate gradient methods with the strong Wolfe line search (2019)
  2. Vlček, Jan; Lukšan, Ladislav: A limited-memory optimization method using the infinitely many times repeated BNS update and conjugate directions (2019)
  3. Zhou, W.; Akrotirianakis, I. G.; Yektamaram, S.; Griffin, J. D.: A matrix-free line-search algorithm for nonconvex optimization (2019)
  4. Ali, M. Montaz; Oliphant, Terry-Leigh: A trajectory-based method for constrained nonlinear optimization problems (2018)
  5. Amaioua, Nadir; Audet, Charles; Conn, Andrew R.; Le Digabel, Sébastien: Efficient solution of quadratically constrained quadratic subproblems within the mesh adaptive direct search algorithm (2018)
  6. Andrei, Neculai: A Dai-Liao conjugate gradient algorithm with clustering of eigenvalues (2018)
  7. Andrei, Neculai: A double parameter scaled BFGS method for unconstrained optimization (2018)
  8. Audet, Charles; Tribes, Christophe: Mesh-based Nelder-Mead algorithm for inequality constrained optimization (2018)
  9. Babaie-Kafaki, Saman; Ghanbari, Reza: Two adaptive Dai-Liao nonlinear conjugate gradient methods (2018)
  10. Babaie-Kafaki, Saman; Ghanbari, Reza: A linear hybridization of the Hestenes-Stiefel method and the memoryless BFGS technique (2018)
  11. Babaie-Kafaki, Saman; Rezaee, Saeed: Two accelerated nonmonotone adaptive trust region line search methods (2018)
  12. Dehghani, Razieh; Bidabadi, Narges; Hosseini, Mohammad Mehdi: A new modified BFGS method for unconstrained optimization problems (2018)
  13. Dong, XiaoLiang; Han, Deren; Dai, Zhifeng; Li, Lixiang; Zhu, Jianguang: An accelerated three-term conjugate gradient method with sufficient descent condition and conjugacy condition (2018)
  14. Huyer, Waltraud; Neumaier, Arnold: MINQ8: general definite and bound constrained indefinite quadratic programming (2018)
  15. Lee, M. S.; Goh, B. S.; Harno, H. G.; Lim, K. H.: On a two-phase approximate greatest descent method for nonlinear optimization with equality constraints (2018)
  16. Li, Dan; Zhu, Detong: An affine scaling interior trust-region method combining with line search filter technique for optimization subject to bounds on variables (2018)
  17. Li, Min: A family of three-term nonlinear conjugate gradient methods close to the memoryless BFGS method (2018)
  18. Li, Min: A modified Hestenes-Stiefel conjugate gradient method close to the memoryless BFGS quasi-Newton method (2018)
  19. Liu, J. K.; Feng, Y. M.; Zou, L. M.: Some three-term conjugate gradient methods with the inexact line search condition (2018)
  20. Livieris, Ioannis E.; Tampakas, Vassilis; Pintelas, Panagiotis: A descent hybrid conjugate gradient method based on the memoryless BFGS update (2018)
