IMFIL
Implicit filtering is a method for solving bound-constrained optimization problems for which derivative information is not available. Unlike methods that use interpolation to reconstruct the function and its higher derivatives, implicit filtering builds on coordinate search and then interpolates to obtain an approximation of the gradient. The book Implicit Filtering describes the algorithm, its convergence theory, and a new MATLAB implementation. It is unique in being the only book in the area of derivative-free or sampling methods to be accompanied by publicly available software. It includes an overview of recent results on the optimization of noisy functions, including results that depend on nonsmooth analysis and results on the handling of constraints. The book is for graduate students who want to learn about this technology, for scientists and engineers who wish to apply the methods to their problems, and for specialists who will use the ideas and the software from this book in their own research.
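The description above characterizes implicit filtering as coordinate search combined with a difference-gradient step. The sketch below illustrates that idea only; it is not the IMFIL code, and the function name, scale schedule, and backtracking line search are assumptions chosen for this example. At each scale h the objective is sampled on the coordinate stencil x +/- h*e_i, a difference gradient is formed, and a projected gradient step is tried; when no stencil point improves on f(x), the scale is reduced.

```python
import numpy as np

def implicit_filtering(f, x0, lower, upper,
                       scales=(0.5, 0.25, 0.125, 0.0625), max_iter=100):
    """Minimal sketch of implicit filtering on a box [lower, upper]^n.

    Hypothetical illustration, not the IMFIL implementation: samples the
    coordinate stencil, builds a difference gradient, and takes projected
    backtracking steps, shrinking the stencil on failure.
    """
    x = np.clip(np.asarray(x0, dtype=float), lower, upper)
    fx = f(x)
    n = x.size
    for h in scales:
        for _ in range(max_iter):
            # Sample f on the stencil x +/- h*e_i (projected onto the box)
            # and form a central-difference approximation of the gradient.
            grad = np.zeros(n)
            best_f, best_x = fx, x
            for i in range(n):
                e = np.zeros(n)
                e[i] = h
                xp = np.clip(x + e, lower, upper)
                xm = np.clip(x - e, lower, upper)
                fp, fm = f(xp), f(xm)
                if xp[i] != xm[i]:
                    grad[i] = (fp - fm) / (xp[i] - xm[i])
                for xc, fc in ((xp, fp), (xm, fm)):
                    if fc < best_f:
                        best_f, best_x = fc, xc
            if best_f >= fx:
                break  # stencil failure: no improvement, reduce the scale
            # Projected backtracking line search along the negative gradient.
            step, improved = 1.0, False
            for _ in range(10):
                xt = np.clip(x - step * grad, lower, upper)
                ft = f(xt)
                if ft < fx:
                    x, fx, improved = xt, ft, True
                    break
                step *= 0.5
            if not improved:
                x, fx = best_x, best_f  # fall back to the best stencil point
    return x, fx
```

On a smooth quadratic such as f(x) = (x0 - 0.3)^2 + (x1 - 0.7)^2 over [0, 1]^2, this sketch converges to the interior minimizer; the shrinking scale schedule is what lets the same loop cope with noisy objectives, which is the setting the book emphasizes.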
References in zbMATH (referenced in 22 articles, 1 standard article)
Showing results 1 to 20 of 22.
Sorted by year.
- Loreto, Milagros; Aponte, Hugo; Cores, Debora; Raydan, Marcos: Nonsmooth spectral gradient methods for unconstrained optimization (2017)
- Chen, Xiaojun; Kelley, C. T.: Optimization with hidden constraints and embedded Monte Carlo computations (2016)
- Regis, Rommel G.: On the properties of positive spanning sets and positive bases (2016)
- Stich, S. U.; Müller, C. L.; Gärtner, B.: Variable metric random pursuit (2016)
- Grippo, L.; Rinaldi, F.: A class of derivative-free nonmonotone optimization algorithms employing coordinate rotations and gradient approximations (2015)
- Regis, Rommel G.: The calculus of simplex gradients (2015)
- Cornelio, Anastasia; Piccolomini, Elena Loli; Nagy, James G.: Constrained numerical optimization methods for blind deconvolution (2014)
- Ganapathy, Kanthaswamy; Jerome, Jovitha: Control of dead-time systems using derivative free local search guided population based incremental learning algorithms (2014)
- Hare, W.; Nutini, J.: A derivative-free approximate gradient sampling algorithm for finite minimax problems (2013)
- Olufsen, Mette S.; Ottesen, Johnny T.: A practical approach to parameter estimation applied to model predicting heart rate regulation (2013)
- Rios, Luis Miguel; Sahinidis, Nikolaos V.: Derivative-free optimization: a review of algorithms and comparison of software implementations (2013)
- Hagstrom, Thomas: High-order radiation boundary conditions for stratified media and curvilinear coordinates (2012)
- Kelley, C. T.: Implicit filtering (2011)
- Conn, A. R.; Scheinberg, K.; Vicente, Luís N.: Geometry of interpolation sets in derivative free optimization (2008)
- Custódio, A. L.; Vicente, L. N.: Using sampling and simplex derivatives in pattern search methods (2007)
- Vanden Berghen, Frank; Bersini, Hugues: CONDOR, a new parallel, constrained extension of Powell’s UOBYQA algorithm: Experimental results and comparison with the DFO algorithm (2005)
- Arnold, Dirk V.; Beyer, Hans-Georg: A comparison of evolution strategies with other direct search methods in the presence of noise (2003)
- Kolda, Tamara G.; Lewis, Robert Michael; Torczon, Virginia: Optimization by direct search: New perspectives on some classical and modern methods (2003)
- Hintermüller, M.: Solving nonlinear programming problems with noisy function values and noisy gradients (2002)
- Carter, R. G.; Gablonsky, J. M.; Patrick, A.; Kelley, C. T.; Eslinger, O. J.: Algorithms for noisy problems in gas transmission pipeline optimization (2001)