CMA-ES

CMA-ES stands for Covariance Matrix Adaptation Evolution Strategy. Evolution strategies (ES) are stochastic, derivative-free methods for the numerical optimization of non-linear or non-convex continuous optimization problems. They belong to the class of evolutionary algorithms and evolutionary computation. An evolutionary algorithm is broadly based on the principle of biological evolution, namely the repeated interplay of variation (via mutation and recombination) and selection: in each generation (iteration), new individuals (candidate solutions, denoted x) are generated by variation, usually stochastically, and then some individuals are selected for the next generation based on their fitness or objective function value f(x). In this way, individuals with better and better f-values are generated over the generation sequence.

In an evolution strategy, new candidate solutions are sampled according to a multivariate normal distribution on R^n. Pairwise dependencies between the variables in this distribution are represented by a covariance matrix. The covariance matrix adaptation (CMA) is a method to update the covariance matrix of this distribution. This is particularly useful if the function f is ill-conditioned. Adaptation of the covariance matrix amounts to learning a second-order model of the underlying objective function, similar to the approximation of the inverse Hessian matrix in quasi-Newton methods of classical optimization. In contrast to most classical methods, fewer assumptions are made about the nature of the underlying objective function. Only the ranking between candidate solutions is exploited for learning the sample distribution; neither derivatives nor even the function values themselves are required by the method. (Source: http://plato.asu.edu)
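The loop described above — sample from a multivariate normal, rank the candidates by f, recombine the best, and adapt the step size and covariance matrix — can be sketched as follows. This is a minimal illustrative implementation in Python/NumPy, using the commonly published default parameter settings; the function name `cma_es` and its signature are assumptions for this sketch, not an interface from the original source, and a production implementation would add termination criteria and numerical safeguards.

```python
import numpy as np

def cma_es(f, x0, sigma0=0.5, max_iters=200, seed=0):
    """Minimal (mu/mu_w, lambda)-CMA-ES sketch (illustrative, not production)."""
    rng = np.random.default_rng(seed)
    n = len(x0)
    lam = 4 + int(3 * np.log(n))                 # population size
    mu = lam // 2                                # number of selected parents
    w = np.log(mu + 0.5) - np.log(np.arange(1, mu + 1))
    w = w / w.sum()                              # positive recombination weights
    mu_eff = 1.0 / np.sum(w**2)                  # variance-effective selection mass

    # Strategy parameters (standard default formulas).
    c_sigma = (mu_eff + 2) / (n + mu_eff + 5)
    d_sigma = 1 + 2 * max(0.0, np.sqrt((mu_eff - 1) / (n + 1)) - 1) + c_sigma
    c_c = (4 + mu_eff / n) / (n + 4 + 2 * mu_eff / n)
    c_1 = 2 / ((n + 1.3) ** 2 + mu_eff)
    c_mu = min(1 - c_1, 2 * (mu_eff - 2 + 1 / mu_eff) / ((n + 2) ** 2 + mu_eff))
    chi_n = np.sqrt(n) * (1 - 1 / (4 * n) + 1 / (21 * n**2))  # E||N(0, I)||

    mean, sigma, C = np.asarray(x0, dtype=float), sigma0, np.eye(n)
    p_sigma, p_c = np.zeros(n), np.zeros(n)
    best_x, best_f = mean.copy(), f(mean)

    for g in range(1, max_iters + 1):
        # Sample lam candidates from N(mean, sigma^2 C) via C = B diag(D^2) B^T.
        D2, B = np.linalg.eigh(C)
        D = np.sqrt(np.maximum(D2, 1e-20))
        z = rng.standard_normal((lam, n))
        y = (z * D) @ B.T                        # rows y_i ~ N(0, C)
        x = mean + sigma * y
        fit = np.array([f(xi) for xi in x])
        order = np.argsort(fit)                  # only the ranking of f is used
        if fit[order[0]] < best_f:
            best_f, best_x = fit[order[0]], x[order[0]].copy()

        # Recombine the mu best steps and move the mean.
        y_w = w @ y[order[:mu]]
        mean = mean + sigma * y_w

        # Step-size control via the conjugate evolution path.
        C_inv_sqrt_yw = B @ ((B.T @ y_w) / D)
        p_sigma = ((1 - c_sigma) * p_sigma
                   + np.sqrt(c_sigma * (2 - c_sigma) * mu_eff) * C_inv_sqrt_yw)
        sigma *= np.exp((c_sigma / d_sigma) * (np.linalg.norm(p_sigma) / chi_n - 1))

        # Covariance matrix adaptation: rank-one and rank-mu updates.
        h_sigma = (np.linalg.norm(p_sigma) / np.sqrt(1 - (1 - c_sigma) ** (2 * g))
                   < (1.4 + 2 / (n + 1)) * chi_n)
        p_c = (1 - c_c) * p_c + h_sigma * np.sqrt(c_c * (2 - c_c) * mu_eff) * y_w
        rank_mu = (y[order[:mu]].T * w) @ y[order[:mu]]
        C = ((1 - c_1 - c_mu) * C
             + c_1 * (np.outer(p_c, p_c) + (1 - h_sigma) * c_c * (2 - c_c) * C)
             + c_mu * rank_mu)
        C = (C + C.T) / 2                        # keep C numerically symmetric

    return best_x, best_f
```

For example, minimizing the shifted sphere function f(x) = sum((x - 1.5)^2) from the origin recovers the optimum to high precision within a few hundred generations. Note that only the argsort of the fitness values enters the update, reflecting the rank-based invariance property mentioned above.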


References in zbMATH (referenced in 106 articles)

Showing results 1 to 20 of 106, sorted by year (citations).


  1. Chen, Huangke; Cheng, Ran; Wen, Jinming; Li, Haifeng; Weng, Jian: Solving large-scale many-objective optimization problems by covariance matrix adaptation evolution strategy with scalable small subpopulations (2020)
  2. Hellwig, Michael; Beyer, Hans-Georg: On the steady state analysis of covariance matrix self-adaptation evolution strategies on the noisy ellipsoid model (2020)
  3. Horváth, Gábor; Horváth, Illés; Telek, Miklós: High order concentrated matrix-exponential distributions (2020)
  4. Liang, Liang: A fusion multiobjective empire split algorithm (2020)
  5. Razaaly, Nassim; Persico, Giacomo; Gori, Giulio; Congedo, Pietro Marco: Quantile-based robust optimization of a supersonic nozzle for organic Rankine cycle turbines (2020)
  6. Verma, Aekaansh; Wong, Kwai; Marsden, Alison L.: A concurrent implementation of the surrogate management framework with application to cardiovascular shape optimization (2020)
  7. Zhu, H.; Hu, Y. M.; Zhu, W. D.; Fan, W.; Zhou, B. W.: Multi-objective design optimization of an engine accessory drive system with a robustness analysis (2020)
  8. Breunig, U.; Baldacci, R.; Hartl, R. F.; Vidal, T.: The electric two-echelon vehicle routing problem (2019)
  9. Buet, Blanche; Mirebeau, Jean-Marie; van Gennip, Yves; Desquilbet, François; Dreo, Johann; Barbaresco, Frédéric; Leonardi, Gian Paolo; Masnou, Simon; Schönlieb, Carola-Bibiane: Partial differential equations and variational methods for geometric processing of images (2019)
  10. Gebhardt, Gregor H. W.; Kupcsik, Andras; Neumann, Gerhard: The kernel Kalman rule. Efficient nonparametric inference by recursive least-squares and subspace projections (2019)
  11. Łapa, Krystian: Meta-optimization of multi-objective population-based algorithms using multi-objective performance metrics (2019)
  12. Lehéricy, Luc: Consistent order estimation for nonparametric hidden Markov models (2019)
  13. Li, Wei: Matrix adaptation evolution strategy with multi-objective optimization for multimodal optimization (2019)
  14. Raponi, Elena; Bujny, Mariusz; Olhofer, Markus; Aulig, Nikola; Boria, Simonetta; Duddeck, Fabian: Kriging-assisted topology optimization of crash structures (2019)
  15. Tran, Anh; Sun, Jing; Furlan, John M.; Pagalthivarthi, Krishnan V.; Visintainer, Robert J.; Wang, Yan: pBO-2GP-3B: a batch parallel known/unknown constrained Bayesian optimization with feasibility classification and its applications in computational fluid dynamics (2019)
  16. Tsai, Hsing-Chih: Confined teaching-learning-based optimization with variable search strategies for continuous optimization (2019)
  17. Yang, Fan; Ren, Hu; Hu, Zhili: Maximum likelihood estimation for three-parameter Weibull distribution using evolutionary strategy (2019)
  18. Agathos, Konstantinos; Chatzi, Eleni; Bordas, Stéphane P. A.: Multiple crack detection in 3D using a stable XFEM and global optimization (2018)
  19. Arampatzis, Georgios; Wälchli, Daniel; Angelikopoulos, Panagiotis; Wu, Stephen; Hadjidoukas, Panagiotis; Koumoutsakos, Petros: Langevin diffusion for population based sampling with an application in Bayesian inference for pharmacodynamics (2018)
  20. Fujii, Garuda; Takahashi, Masayuki; Akimoto, Youhei: CMA-ES-based structural topology optimization using a level set boundary expression -- application to optical and carpet cloaks (2018)



Further publications can be found at: https://www.lri.fr/~hansen/publications.html#hansenReview2006