Implicit filtering is a method for solving bound-constrained optimization problems for which derivative information is not available. Unlike methods that use interpolation to reconstruct the function and its higher derivatives, implicit filtering builds on coordinate search and then interpolates to obtain an approximation of the gradient. Implicit Filtering describes the algorithm, its convergence theory, and a new MATLAB implementation. It is the only book in the area of derivative-free or sampling methods to be accompanied by publicly available software. It includes an overview of recent results on optimization of noisy functions, including results that depend on nonsmooth analysis and results on the handling of constraints. This book is intended for graduate students who want to learn about this technology, for scientists and engineers who wish to apply the methods to their problems, and for specialists who will use the ideas and the software from this book in their own research.
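The idea described above, evaluating the objective on a coordinate stencil of scale h, using those samples both as a coordinate search and as a difference approximation of the gradient, and shrinking h when the stencil finds no descent, can be sketched as follows. This is a minimal illustrative sketch in Python, not the book's imfil.m implementation; the function name, the simple backtracking line search, and all parameter defaults are assumptions made for the example.

```python
import numpy as np

def implicit_filtering_sketch(f, x0, lower, upper, h=0.5, h_min=1e-6, max_iter=200):
    """Hypothetical minimal implicit-filtering loop (not the book's imfil.m):
    sample a coordinate stencil of scale h, form a central-difference gradient
    from those samples, take a projected line-search step, and halve h on
    'stencil failure' (no stencil point improves on the current iterate)."""
    lower, upper = np.asarray(lower, float), np.asarray(upper, float)
    x = np.clip(np.asarray(x0, float), lower, upper)
    fx = f(x)
    for _ in range(max_iter):
        if h <= h_min:
            break
        n = x.size
        g = np.zeros(n)            # difference-gradient approximation
        best_f, best_x = fx, x     # best point seen on the stencil
        for i in range(n):
            for s in (+1.0, -1.0):
                xt = x.copy()
                xt[i] = np.clip(xt[i] + s * h, lower[i], upper[i])
                ft = f(xt)
                g[i] += s * (ft - fx) / (2.0 * h)
                if ft < best_f:
                    best_f, best_x = ft, xt
        if best_f >= fx:           # stencil failure: refine the scale
            h /= 2.0
            continue
        # Projected step along the difference gradient with backtracking.
        t = 1.0
        while t > 1e-10:
            xt = np.clip(x - t * g, lower, upper)
            ft = f(xt)
            if ft < fx:
                x, fx = xt, ft
                break
            t /= 2.0
        else:
            x, fx = best_x, best_f  # fall back to the best stencil point
    return x, fx
```

For a smooth quadratic the central differences recover the gradient exactly, so the sketch converges in a few steps; the point of the construction, per the blurb, is that the same stencil samples still give a usable descent direction when the function is noisy, because differencing at scale h "filters" oscillations smaller than the stencil.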

References in zbMATH (referenced in 44 articles, 1 standard article)

Showing results 1 to 20 of 44.
Sorted by year (citations)


  1. Shi, Hao-Jun M.; Xie, Yuchen; Byrd, Richard; Nocedal, Jorge: A noise-tolerant quasi-Newton algorithm for unconstrained optimization (2022)
  2. Berahas, A. S.; Cao, L.; Scheinberg, K.: Global convergence rate analysis of a generic line search algorithm with noise (2021)
  3. Dedoncker, Sander; Desmet, Wim; Naets, Frank: Generating set search using simplex gradients for bound-constrained black-box optimization (2021)
  4. Gal, Raviv; Haber, Eldad; Irwin, Brian; Saleh, Bilal; Ziv, Avi: How to catch a lion in the desert: on the solution of the coverage directed generation (CDG) problem (2021)
  5. Chakraborty, Suvra Kanti; Panda, Geetanjali: A modified coordinate search method based on axes rotation (2020)
  6. Diniz-Ehrhardt, M. A.; Ferreira, D. G.; Santos, S. A.: Applying the pattern search implicit filtering algorithm for solving a noisy problem of parameter identification (2020)
  7. Hare, Warren; Jarry-Bolduc, Gabriel: A deterministic algorithm to compute the cosine measure of a finite positive spanning set (2020)
  8. Hare, Warren; Jarry-Bolduc, Gabriel: Calculus identities for generalized simplex gradients: rules and applications (2020)
  9. Hare, Warren; Planiden, Chayne; Sagastizábal, Claudia: A derivative-free (\mathcal{VU})-algorithm for convex finite-max problems (2020)
  10. Sauk, Benjamin; Ploskas, Nikolaos; Sahinidis, Nikolaos: GPU parameter tuning for tall and skinny dense linear least squares problems (2020)
  11. Xie, Yuchen; Byrd, Richard H.; Nocedal, Jorge: Analysis of the BFGS method with errors (2020)
  12. Berahas, Albert S.; Byrd, Richard H.; Nocedal, Jorge: Derivative-free optimization of noisy functions via quasi-Newton methods (2019)
  13. Coope, Ian; Tappenden, Rachael: Efficient calculation of regular simplex gradients (2019)
  14. Gratton, S.; Royer, C. W.; Vicente, L. N.; Zhang, Z.: Direct search based on probabilistic feasible descent for bound and linearly constrained problems (2019)
  15. Larson, Jeffrey; Menickelly, Matt; Wild, Stefan M.: Derivative-free optimization methods (2019)
  16. Müller, Juliane; Day, Marcus: Surrogate optimization of computationally expensive black-box problems with hidden constraints (2019)
  17. Chen, Xiaojun; Kelley, C. T.; Xu, Fengmin; Zhang, Zaikun: A smoothing direct search method for Monte Carlo-based bound constrained composite nonsmooth optimization (2018)
  18. Maggiar, Alvaro; Wächter, Andreas; Dolinskaya, Irina S.; Staum, Jeremy: A derivative-free trust-region algorithm for the optimization of functions smoothed via Gaussian convolution using adaptive multiple importance sampling (2018)
  19. Ribeiro, Ademir A.; Sachine, Mael; Santos, Sandra A.: On the approximate solutions of augmented subproblems within sequential methods for nonlinear programming (2018)
  20. Loreto, Milagros; Aponte, Hugo; Cores, Debora; Raydan, Marcos: Nonsmooth spectral gradient methods for unconstrained optimization (2017)
