CUTEr

CUTEr is a versatile testing environment for optimization and linear algebra solvers. The package contains a collection of test problems, along with Fortran 77, Fortran 90/95 and Matlab tools intended to help developers design, compare and improve new and existing solvers. The test problems provided are written in the so-called Standard Input Format (SIF). A decoder that converts this format into well-defined Fortran 77 source and data files is available as a separate package. Once translated, these files may be manipulated to provide tools suitable for testing optimization packages. Ready-to-use interfaces to existing packages, such as MINOS, SNOPT, filterSQP, Knitro, and more, are provided; see the interfaces section for a complete list.
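As a rough illustration of the workflow, the sketch below shows a minimal Fortran 77 driver built on the unconstrained CUTEr evaluation tools (USETUP, UFN, UGR). It is only a sketch, under the assumption that the SIF decoder has already been run on a problem and has left the data file OUTSDIF.d in the working directory; the array dimension NMAX and unit numbers are illustrative choices, and the exact argument lists should be checked against the tools documentation shipped with the installed version.

C     Sketch of a driver for an unconstrained CUTEr test problem.
C     Assumes OUTSDIF.d (written by the SIF decoder) is present and
C     that the problem has at most NMAX variables.
      PROGRAM DRIVER
      INTEGER          NMAX, INPUT, IOUT, N, I
      PARAMETER      ( NMAX = 1000, INPUT = 55, IOUT = 6 )
      DOUBLE PRECISION X( NMAX ), BL( NMAX ), BU( NMAX )
      DOUBLE PRECISION G( NMAX ), F
C     Open the problem data file produced by the SIF decoder
      OPEN( INPUT, FILE = 'OUTSDIF.d', FORM = 'FORMATTED',
     *      STATUS = 'OLD' )
C     Set up the problem: dimension, starting point and simple bounds
      CALL USETUP( INPUT, IOUT, N, X, BL, BU, NMAX )
C     Evaluate the objective and its gradient at the starting point
      CALL UFN( N, X, F )
      CALL UGR( N, X, G )
      WRITE( IOUT, * ) 'f at starting point =', F
      WRITE( IOUT, * ) 'gradient =', ( G( I ), I = 1, N )
      CLOSE( INPUT )
      STOP
      END

A solver interface, such as the ready-made ones listed above, follows the same pattern: it calls the setup routine once and then repeatedly queries the evaluation tools for function, gradient and Hessian information as the iteration proceeds.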


References in zbMATH (referenced in 555 articles, 1 standard article)

Showing results 1 to 20 of 555, sorted by year (citations).


  1. Andrei, Neculai: New conjugate gradient algorithms based on self-scaling memoryless Broyden-Fletcher-Goldfarb-Shanno method (2020)
  2. Andrei, Neculai: Diagonal approximation of the Hessian by finite differences for unconstrained optimization (2020)
  3. Andrei, Neculai: A double parameter self-scaling memoryless BFGS method for unconstrained optimization (2020)
  4. Babaie-Kafaki, Saman: A modified scaled memoryless symmetric rank-one method (2020)
  5. Bartholomew-Biggs, Michael; Beddiaf, Salah; Christianson, Bruce: A comparison of methods for traversing regions of non-convexity in optimization problems (2020)
  6. Dai, Yu-Hong; Liu, Xin-Wei; Sun, Jie: A primal-dual interior-point method capable of rapidly detecting infeasibility for nonlinear programs (2020)
  7. Dehghani, Razieh; Bidabadi, Narges; Fahs, Hassan; Hosseini, Mohammad Mehdi: A conjugate gradient method based on a modified secant relation for unconstrained optimization (2020)
  8. Estrin, Ron; Friedlander, Michael P.; Orban, Dominique; Saunders, Michael A.: Implementing a smooth exact penalty function for general constrained nonlinear optimization (2020)
  9. Gill, Philip E.; Kungurtsev, Vyacheslav; Robinson, Daniel P.: A shifted primal-dual penalty-barrier method for nonlinear optimization (2020)
  10. Li, Min: A three term Polak-Ribière-Polyak conjugate gradient method close to the memoryless BFGS quasi-Newton method (2020)
  11. Liu, J. K.; Zhao, Y. X.; Wu, X. L.: Some three-term conjugate gradient methods with the new direction structure (2020)
  12. Liu, Meixing; Ma, Guodong; Yin, Jianghua: Two new conjugate gradient methods for unconstrained optimization (2020)
  13. Liu, Xin-Wei; Dai, Yu-Hong: A globally convergent primal-dual interior-point relaxation method for nonlinear programs (2020)
  14. Liu, Zexian; Liu, Hongwei; Dai, Yu-Hong: An improved Dai-Kou conjugate gradient algorithm for unconstrained optimization (2020)
  15. Ou, Yigui; Lin, Haichan: A class of accelerated conjugate-gradient-like methods based on a modified secant equation (2020)
  16. Sellami, Badreddine; Chiheb Eddine Sellami, Mohamed: Global convergence of a modified Fletcher-Reeves conjugate gradient method with Wolfe line search (2020)
  17. Aminifard, Z.; Babaie-Kafaki, S.: Matrix analyses on the Dai-Liao conjugate gradient method (2019)
  18. Aminifard, Zohre; Babaie-Kafaki, Saman: An optimal parameter choice for the Dai-Liao family of conjugate gradient methods by avoiding a direction of the maximum magnification by the search direction matrix (2019)
  19. Aminifard, Zohre; Babaie-Kafaki, Saman: A modified descent Polak-Ribière-Polyak conjugate gradient method with global convergence property for nonconvex functions (2019)
  20. Amini, Keyvan; Faramarzi, Parvaneh; Pirfalah, Nasrin: A modified Hestenes-Stiefel conjugate gradient method with an optimal property (2019)
