IMFIL
Implicit filtering is a method for solving bound-constrained optimization problems for which derivative information is unavailable. Unlike methods that use interpolation to reconstruct the function and its higher derivatives, implicit filtering builds on coordinate search and then interpolates to obtain an approximation of the gradient. The book Implicit Filtering describes the algorithm, its convergence theory, and a new MATLAB implementation. It is the only book in the area of derivative-free or sampling methods to be accompanied by publicly available software. It includes an overview of recent results on the optimization of noisy functions, including results that depend on nonsmooth analysis and results on the handling of constraints. The book is intended for graduate students who want to learn about this technology, for scientists and engineers who wish to apply the methods to their problems, and for specialists who will use the ideas and the software from this book in their own research.
References in zbMATH (referenced in 40 articles, 1 standard article)
Showing results 1 to 20 of 40, sorted by year.
- Chakraborty, Suvra Kanti; Panda, Geetanjali: A modified coordinate search method based on axes rotation (2020)
- Diniz-Ehrhardt, M. A.; Ferreira, D. G.; Santos, S. A.: Applying the pattern search implicit filtering algorithm for solving a noisy problem of parameter identification (2020)
- Hare, Warren; Jarry-Bolduc, Gabriel: Calculus identities for generalized simplex gradients: rules and applications (2020)
- Hare, Warren; Jarry-Bolduc, Gabriel: A deterministic algorithm to compute the cosine measure of a finite positive spanning set (2020)
- Hare, Warren; Planiden, Chayne; Sagastizábal, Claudia: A derivative-free (\mathcal{V}\mathcal{U})-algorithm for convex finite-max problems (2020)
- Sauk, Benjamin; Ploskas, Nikolaos; Sahinidis, Nikolaos: GPU parameter tuning for tall and skinny dense linear least squares problems (2020)
- Xie, Yuchen; Byrd, Richard H.; Nocedal, Jorge: Analysis of the BFGS method with errors (2020)
- Berahas, Albert S.; Byrd, Richard H.; Nocedal, Jorge: Derivative-free optimization of noisy functions via quasi-Newton methods (2019)
- Coope, Ian; Tappenden, Rachael: Efficient calculation of regular simplex gradients (2019)
- Gratton, S.; Royer, C. W.; Vicente, L. N.; Zhang, Z.: Direct search based on probabilistic feasible descent for bound and linearly constrained problems (2019)
- Larson, Jeffrey; Menickelly, Matt; Wild, Stefan M.: Derivative-free optimization methods (2019)
- Müller, Juliane; Day, Marcus: Surrogate optimization of computationally expensive black-box problems with hidden constraints (2019)
- Chen, Xiaojun; Kelley, C. T.; Xu, Fengmin; Zhang, Zaikun: A smoothing direct search method for Monte Carlo-based bound constrained composite nonsmooth optimization (2018)
- Maggiar, Alvaro; Wächter, Andreas; Dolinskaya, Irina S.; Staum, Jeremy: A derivative-free trust-region algorithm for the optimization of functions smoothed via Gaussian convolution using adaptive multiple importance sampling (2018)
- Ribeiro, Ademir A.; Sachine, Mael; Santos, Sandra A.: On the approximate solutions of augmented subproblems within sequential methods for nonlinear programming (2018)
- Loreto, Milagros; Aponte, Hugo; Cores, Debora; Raydan, Marcos: Nonsmooth spectral gradient methods for unconstrained optimization (2017)
- Chen, Xiaojun; Kelley, C. T.: Optimization with hidden constraints and embedded Monte Carlo computations (2016)
- Regis, Rommel G.: On the properties of positive spanning sets and positive bases (2016)
- Stich, S. U.; Müller, C. L.; Gärtner, B.: Variable metric random pursuit (2016)
- Grippo, L.; Rinaldi, F.: A class of derivative-free nonmonotone optimization algorithms employing coordinate rotations and gradient approximations (2015)