CMA-ES

CMA-ES stands for Covariance Matrix Adaptation Evolution Strategy. Evolution strategies (ES) are stochastic, derivative-free methods for the numerical optimization of non-linear or non-convex continuous optimization problems. They belong to the class of evolutionary algorithms and evolutionary computation. An evolutionary algorithm is broadly based on the principle of biological evolution, namely the repeated interplay of variation (via mutation and recombination) and selection: in each generation (iteration), new individuals (candidate solutions, denoted as x) are generated by variation, usually in a stochastic way, and then some individuals are selected for the next generation based on their fitness or objective function value f(x). In this way, over the generation sequence, individuals with better and better f-values are generated.

In an evolution strategy, new candidate solutions are sampled according to a multivariate normal distribution in ℝ^n. Pairwise dependencies between the variables in this distribution are represented by a covariance matrix. The covariance matrix adaptation (CMA) is a method to update the covariance matrix of this distribution. This is particularly useful if the function f is ill-conditioned. Adaptation of the covariance matrix amounts to learning a second-order model of the underlying objective function, similar to the approximation of the inverse Hessian matrix in quasi-Newton methods in classical optimization. In contrast to most classical methods, fewer assumptions about the nature of the underlying objective function are made. Only the ranking between candidate solutions is exploited for learning the sample distribution; neither derivatives nor even the function values themselves are required by the method. (Source: http://plato.asu.edu)
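
The loop described above (sample from a multivariate normal, rank the candidates by f, recombine the best, and adapt the covariance matrix) can be sketched in a few lines of NumPy. The following is a deliberately simplified illustration, not Hansen's full algorithm: it uses only a rank-mu covariance update, omits step-size control and evolution paths, and the objective function sphere as well as all parameter settings (sigma, lam, c_mu, and so on) are illustrative assumptions rather than values taken from the text above.

import numpy as np

def sphere(x):
    # Example objective f(x) = sum(x_i^2); only its ranking is used below.
    return float(np.dot(x, x))

def simplified_cma_es(f, x0, sigma=0.5, lam=12, generations=200, seed=0):
    rng = np.random.default_rng(seed)
    n = len(x0)
    mu = lam // 2                          # number of selected (parent) solutions
    weights = np.log(mu + 0.5) - np.log(np.arange(1, mu + 1))
    weights /= weights.sum()               # positive, decreasing recombination weights
    mean = np.asarray(x0, dtype=float)
    C = np.eye(n)                          # covariance matrix of the sample distribution
    c_mu = 0.3                             # ad-hoc learning rate for the rank-mu update

    for _ in range(generations):
        # Variation: sample lam candidates from N(mean, sigma^2 * C).
        A = np.linalg.cholesky(C)
        y = rng.standard_normal((lam, n)) @ A.T   # y ~ N(0, C)
        x = mean + sigma * y

        # Selection: rank candidates by f-value; only the ranking matters.
        order = np.argsort([f(xi) for xi in x])
        y_sel = y[order[:mu]]

        # Recombination: move the mean toward the weighted best candidates.
        mean = mean + sigma * (weights @ y_sel)

        # Covariance matrix adaptation: rank-mu update from the selected steps.
        C = (1 - c_mu) * C + c_mu * (y_sel.T * weights) @ y_sel

    return mean, f(mean)

if __name__ == "__main__":
    xbest, fbest = simplified_cma_es(sphere, x0=np.ones(5))
    print(xbest, fbest)

Because step-size adaptation is omitted, this sketch only approaches the optimum down to the sampling scale set by sigma; the complete CMA-ES additionally uses cumulative step-size control and a rank-one covariance update driven by an evolution path to reach high precision.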


References in zbMATH (referenced in 64 articles)

Showing results 1 to 20 of 64, sorted by year (citations).


  1. Amaran, Satyajith; Sahinidis, Nikolaos V.; Sharda, Bikram; Bury, Scott J.: Simulation optimization: a review of algorithms and applications (2016)
  2. Balesdent, Mathieu; Morio, Jérôme; Brevault, Loïc: Rare event probability estimation in the presence of epistemic uncertainty on input probability distribution parameters (2016)
  3. Durantin, Cédric; Marzat, Julien; Balesdent, Mathieu: Analysis of multi-objective Kriging-based methods for constrained global optimization (2016)
  4. Regis, Rommel G.: On the convergence of adaptive stochastic search methods for constrained and multi-objective black-box optimization (2016)
  5. Gil, Debora; Roche, David; Borràs, Agnés; Giraldo, Jesús: Terminating evolutionary algorithms at their steady state (2015)
  6. Schmidhuber, Jürgen: Deep learning in neural networks: an overview (2015)
  7. Amaran, Satyajith; Sahinidis, Nikolaos V.; Sharda, Bikram; Bury, Scott J.: Simulation optimization: a review of algorithms and applications (2014)
  8. Caraffini, Fabio; Neri, Ferrante; Picinali, Lorenzo: An analysis on separability for memetic computing automatic design (2014)
  9. Gong, Wenyin; Cai, Zhihua; Liang, Dingwen: Engineering optimization by means of an improved constrained differential evolution (2014)
  10. Hansen, Nikolaus; Auger, Anne: Principled design of continuous stochastic search: from theory to practice (2014)
  11. Lacroix, Benjamin; Molina, Daniel; Herrera, Francisco: Region based memetic algorithm for real-parameter optimisation (2014)
  12. Liao, Tianjun; Stützle, Thomas; Montes de Oca, Marco A.; Dorigo, Marco: A unified ant colony optimization algorithm for continuous optimization (2014)
  13. Mereuta, A.; Aupetit, S.; Monmarché, N.; Slimane, M.: Web page textual color contrast compensation for CVD users using optimization methods (2014)
  14. Billy, Frédérique; Clairambault, Jean; Fercoq, Olivier: Optimisation of cancer drug treatments using cell population dynamics (2013)
  15. Caraffini, Fabio; Neri, Ferrante; Iacca, Giovanni; Mol, Aran: Parallel memetic structures (2013)
  16. Caraffini, Fabio; Neri, Ferrante; Passow, Benjamin N.; Iacca, Giovanni: Re-sampled inheritance search: high performance despite the simplicity (2013)
  17. Fix, Jérémy: Template based black-box optimization of dynamic neural fields (2013)
  18. Janusevskis, Janis; Le Riche, Rodolphe: Simultaneous Kriging-based estimation and optimization of mean response (2013)
  19. Jordt, Andreas; Koch, Reinhard: Direct model-based tracking of 3D object deformations in depth and color video (2013)
  20. Kunakote, Tawatchai; Bureerat, Sujin: Surrogate-assisted multiobjective evolutionary algorithms for structural shape and sizing optimisation (2013)



Further publications can be found at: https://www.lri.fr/~hansen/publications.html#hansenReview2006