CUTEr is a versatile testing environment for optimization and linear algebra solvers. The package contains a collection of test problems, along with Fortran 77, Fortran 90/95 and Matlab tools intended to help developers design, compare and improve new and existing solvers. The test problems provided are written in the so-called Standard Input Format (SIF). A decoder that converts this format into well-defined Fortran 77 code and data files is available as a separate package. Once translated, these files may be manipulated to provide tools suitable for testing optimization packages. Ready-to-use interfaces to existing packages, such as MINOS, SNOPT, filterSQP, Knitro, and more, are provided. See the interfaces section for a complete list.
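To give a flavour of the workflow, the sketch below shows how a small Fortran 77 driver might evaluate a decoded unconstrained problem through the CUTEr tools. It is a minimal illustration only: it assumes the classical unconstrained tools USETUP and UOFG and the decoder's OUTSDIF.d data file, and the exact routine names and argument lists should be checked against the CUTEr documentation before use.

C     Minimal sketch of a CUTEr driver for an unconstrained problem.
C     Assumes the decoded SIF data file OUTSDIF.d and the classical
C     tools USETUP/UOFG; check argument lists against the CUTEr manual.
      PROGRAM DRIVER
      INTEGER          NMAX
      PARAMETER      ( NMAX = 1000 )
      INTEGER          INPUT, IOUT, N
      DOUBLE PRECISION F
      DOUBLE PRECISION X( NMAX ), BL( NMAX ), BU( NMAX ), G( NMAX )
      INPUT = 55
      IOUT  = 6
C     Open the problem data file produced by the SIF decoder
      OPEN( INPUT, FILE = 'OUTSDIF.d', STATUS = 'OLD' )
C     Set up the problem: dimension, starting point and simple bounds
      CALL USETUP( INPUT, IOUT, N, X, BL, BU, NMAX )
C     Evaluate the objective and its gradient at the starting point
      CALL UOFG( N, X, F, G, .TRUE. )
      WRITE( IOUT, * ) 'f(x0) =', F
      CLOSE( INPUT )
      END

Constrained problems follow the same pattern through the corresponding constrained tools; the solver interfaces listed above wrap drivers of this kind for each supported package.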

References in zbMATH (referenced in 418 articles, 1 standard article)

Showing results 1 to 20 of 418, sorted by year (citations).


  1. Arzani, F.; Peyghami, M. Reza: A new nonmonotone filter Barzilai-Borwein method for solving unconstrained optimization problems (2016)
  2. Babaie-Kafaki, Saman: A modified scaling parameter for the memoryless BFGS updating formula (2016)
  3. Babaie-Kafaki, Saman; Ghanbari, Reza: A descent hybrid modification of the Polak-Ribière-Polyak conjugate gradient method (2016)
  4. Babaie-Kafaki, Saman; Ghanbari, Reza: Descent symmetrization of the Dai-Liao conjugate gradient method (2016)
  5. Byrd, Richard H.; Chin, Gillian M.; Nocedal, Jorge; Öztoprak, Figen: A family of second-order methods for convex $\ell_1$-regularized optimization (2016)
  6. Dodangeh, M.; Vicente, L.N.: Worst case complexity of direct search under convexity (2016)
  7. Du, Xuewu; Zhang, Peng; Ma, Wenya: Some modified conjugate gradient methods for unconstrained optimization (2016)
  8. Fatemi, M.: An optimal parameter for Dai-Liao family of conjugate gradient methods (2016)
  9. Fatemi, Masoud: A new efficient conjugate gradient method for unconstrained optimization (2016)
  10. Forsgren, Anders; Gill, Philip E.; Wong, Elizabeth: Primal and dual active-set methods for convex quadratic programming (2016)
  11. Garmanjani, R.; Júdice, D.; Vicente, L.N.: Trust-region methods without using derivatives: worst case complexity and the nonsmooth case (2016)
  12. Gower, R.M.; Gower, A.L.: Higher-order reverse automatic differentiation with emphasis on the third-order (2016)
  13. Qiu, Songqiang; Chen, Zhongwen: A globally convergent penalty-free method for optimization with equality constraints and simple bounds (2016)
  14. Salleh, Zabidin; Alhawarat, Ahmad: An efficient modification of the Hestenes-Stiefel nonlinear conjugate gradient method with restart property (2016)
  15. Shen, Chungen; Zhang, Lei-Hong; Liu, Wei: A stabilized filter SQP algorithm for nonlinear programming (2016)
  16. Wang, X.Y.; Li, S.J.; Kou, Xi Peng: A self-adaptive three-term conjugate gradient method for monotone nonlinear equations with convex constraints (2016)
  17. Yu, Zhensheng; Gan, Xinyue: Global and local R-linear convergence of a spectral projected gradient method for convex optimization with singular solution (2016)
  18. Zhang, Yang; Dan, Bin: An efficient adaptive scaling parameter for the spectral conjugate gradient method (2016)
  19. Zhu, Xiaojing: On a globally convergent trust region algorithm with infeasibility control for equality constrained optimization (2016)
  20. Al-Baali, Mehiddin; Narushima, Yasushi; Yabe, Hiroshi: A family of three-term conjugate gradient methods with sufficient descent property for unconstrained optimization (2015)
