CUTEr

CUTEr is a versatile testing environment for optimization and linear algebra solvers. The package contains a collection of test problems, along with Fortran 77, Fortran 90/95 and Matlab tools intended to help developers design, compare and improve new and existing solvers. The test problems are written in the so-called Standard Input Format (SIF). A decoder that converts this format into well-defined Fortran 77 subroutines and data files is available as a separate package. Once translated, these files may be manipulated to provide tools suitable for testing optimization packages. Ready-to-use interfaces to existing packages, such as MINOS, SNOPT, filterSQP, Knitro, and more, are provided. See the interfaces section for a complete list.
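
To give a concrete picture of how a decoded problem is typically driven, the sketch below is a minimal Fortran 77 driver for an unconstrained problem. It assumes the SIF decoder has already been run (producing the data file OUTSDIF.d), that the program is linked against the CUTEr tools library, and that the classic unconstrained tools USETUP, UFN and UGR have the signatures shown; the unit numbers and the dimension bound NMAX are illustrative, and everything should be checked against the CUTEr documentation for the installation at hand.

C     Minimal sketch of a CUTEr driver for an unconstrained problem.
C     Assumes the SIF decoder has already produced OUTSDIF.d and that
C     this program is linked against the CUTEr tools library; the
C     subroutine signatures below follow the classic unconstrained
C     tools (USETUP, UFN, UGR) and should be verified locally.
      PROGRAM CUTDRV
      INTEGER          NMAX, INPUT, IOUT, N
      PARAMETER      ( NMAX = 1000, INPUT = 55, IOUT = 6 )
      DOUBLE PRECISION X( NMAX ), BL( NMAX ), BU( NMAX )
      DOUBLE PRECISION F, G( NMAX )
C     Open the problem data file written by the SIF decoder
      OPEN( INPUT, FILE = 'OUTSDIF.d', STATUS = 'OLD' )
C     Set up the problem: dimension, starting point and simple bounds
      CALL USETUP( INPUT, IOUT, N, X, BL, BU, NMAX )
C     Evaluate the objective and its gradient at the starting point
      CALL UFN( N, X, F )
      CALL UGR( N, X, G )
      WRITE( IOUT, * ) 'n = ', N, '   f(x0) = ', F
      CLOSE( INPUT )
      STOP
      END

In practice, the package's interface scripts automate steps of this kind, compiling and linking such a driver against the chosen solver interface.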


References in zbMATH (referenced in 545 articles, 1 standard article)

Showing results 1 to 20 of 545, sorted by year (citations).

  1. Dehghani, Razieh; Bidabadi, Narges; Fahs, Hassan; Hosseini, Mohammad Mehdi: A conjugate gradient method based on a modified secant relation for unconstrained optimization (2020)
  2. Gill, Philip E.; Kungurtsev, Vyacheslav; Robinson, Daniel P.: A shifted primal-dual penalty-barrier method for nonlinear optimization (2020)
  3. Li, Min: A three term Polak-Ribière-Polyak conjugate gradient method close to the memoryless BFGS quasi-Newton method (2020)
  4. Liu, J. K.; Zhao, Y. X.; Wu, X. L.: Some three-term conjugate gradient methods with the new direction structure (2020)
  5. Liu, Xin-Wei; Dai, Yu-Hong: A globally convergent primal-dual interior-point relaxation method for nonlinear programs (2020)
  6. Liu, Zexian; Liu, Hongwei; Dai, Yu-Hong: An improved Dai-Kou conjugate gradient algorithm for unconstrained optimization (2020)
  7. Aminifard, Z.; Babaie-Kafaki, S.: Matrix analyses on the Dai-Liao conjugate gradient method (2019)
  8. Aminifard, Zohre; Babaie-Kafaki, Saman: An optimal parameter choice for the Dai-Liao family of conjugate gradient methods by avoiding a direction of the maximum magnification by the search direction matrix (2019)
  9. Aminifard, Zohre; Babaie-Kafaki, Saman: A modified descent Polak-Ribière-Polyak conjugate gradient method with global convergence property for nonconvex functions (2019)
  10. Amini, Keyvan; Faramarzi, Parvaneh; Pirfalah, Nasrin: A modified Hestenes-Stiefel conjugate gradient method with an optimal property (2019)
  11. Andrei, Neculai: A new diagonal quasi-Newton updating method with scaled forward finite differences directional derivative for unconstrained optimization (2019)
  12. Armand, Paul; Tran, Ngoc Nguyen: An augmented Lagrangian method for equality constrained optimization with rapid infeasibility detection capabilities (2019)
  13. Audet, Charles; Le Digabel, Sébastien; Tribes, Christophe: The mesh adaptive direct search algorithm for granular and discrete variables (2019)
  14. Babaie-Kafaki, Saman; Aminifard, Zohre: Two-parameter scaled memoryless BFGS methods with a nonmonotone choice for the initial step length (2019)
  15. Boggs, Paul T.; Byrd, Richard H.: Adaptive, limited-memory BFGS algorithms for unconstrained optimization (2019)
  16. Brust, Johannes; Burdakov, Oleg; Erway, Jennifer B.; Marcia, Roummel F.: A dense initialization for limited-memory quasi-Newton methods (2019)
  17. Dong, Wen-Li; Li, Xing; Peng, Zheng: A simulated annealing-based Barzilai-Borwein gradient method for unconstrained optimization problems (2019)
  18. Faramarzi, Parvaneh; Amini, Keyvan: A modified spectral conjugate gradient method with global convergence (2019)
  19. Faramarzi, Parvaneh; Amini, Keyvan: A scaled three-term conjugate gradient method for large-scale unconstrained optimization problem (2019)
  20. Huang, Na; Ma, Chang-Feng: Spectral analysis of the preconditioned system for the 3 × 3 block saddle point problem (2019)
