CUTEr is a versatile testing environment for optimization and linear algebra solvers. The package contains a collection of test problems, along with Fortran 77, Fortran 90/95 and Matlab tools intended to help developers design, compare and improve new and existing solvers. The test problems provided are written in the so-called Standard Input Format (SIF). A decoder that converts this format into well-defined Fortran 77 source and data files is available as a separate package. Once translated, these files can be combined with the supplied tools to test optimization packages. Ready-to-use interfaces to existing packages, such as MINOS, SNOPT, filterSQP, Knitro, and more, are provided; see the interfaces section for a complete list.

References in zbMATH (referenced in 429 articles, 1 standard article)

Showing results 1 to 20 of 429, sorted by year (citations).


  1. Dong, Xiao Liang; Li, Wei Jun; He, Yu Bo: Some modified Yabe-Takano conjugate gradient methods with sufficient descent condition (2017)
  2. Huang, Yuanyuan; Liu, Changhe: Dai-Kou type conjugate gradient methods with a line search only using gradient (2017)
  3. Sala, Ramses; Baldanzini, Niccolò; Pierini, Marco: Global optimization test problems based on random field composition (2017)
  4. Wan, Wei; Biegler, Lorenz T.: Structured regularization for barrier NLP solvers (2017)
  5. Arzani, F.; Peyghami, M. Reza: A new nonmonotone filter Barzilai-Borwein method for solving unconstrained optimization problems (2016)
  6. Babaie-Kafaki, Saman: A modified scaling parameter for the memoryless BFGS updating formula (2016)
  7. Babaie-Kafaki, Saman; Ghanbari, Reza: A descent hybrid modification of the Polak-Ribière-Polyak conjugate gradient method (2016)
  8. Babaie-Kafaki, Saman; Ghanbari, Reza: Descent symmetrization of the Dai-Liao conjugate gradient method (2016)
  9. Byrd, Richard H.; Chin, Gillian M.; Nocedal, Jorge; Oztoprak, Figen: A family of second-order methods for convex $\ell _1$-regularized optimization (2016)
  10. Dodangeh, M.; Vicente, L. N.: Worst case complexity of direct search under convexity (2016)
  11. Du, Xuewu; Zhang, Peng; Ma, Wenya: Some modified conjugate gradient methods for unconstrained optimization (2016)
  12. Fatemi, M.: An optimal parameter for Dai-Liao family of conjugate gradient methods (2016)
  13. Fatemi, Masoud: A new efficient conjugate gradient method for unconstrained optimization (2016)
  14. Forsgren, Anders; Gill, Philip E.; Wong, Elizabeth: Primal and dual active-set methods for convex quadratic programming (2016)
  15. Garmanjani, R.; Júdice, D.; Vicente, L. N.: Trust-region methods without using derivatives: worst case complexity and the nonsmooth case (2016)
  16. Gower, R. M.; Gower, A. L.: Higher-order reverse automatic differentiation with emphasis on the third-order (2016)
  17. Qiu, Songqiang; Chen, Zhongwen: A globally convergent penalty-free method for optimization with equality constraints and simple bounds (2016)
  18. Salleh, Zabidin; Alhawarat, Ahmad: An efficient modification of the Hestenes-Stiefel nonlinear conjugate gradient method with restart property (2016)
  19. Sellami, Badreddine; Chaib, Yacine: New conjugate gradient method for unconstrained optimization (2016)
  20. Shen, Chungen; Zhang, Lei-Hong; Liu, Wei: A stabilized filter SQP algorithm for nonlinear programming (2016)
