Mixture Model-CMA-ES - Inria - Institut national de recherche en sciences et technologies du numérique
Preprint / Working paper, 2016

Mixture Model-CMA-ES

Abstract

We report on our attempt to improve the CMA-ES global optimization algorithm based on two ideas: first, the use of Sobol quasi-random low-discrepancy numbers instead of pseudo-random numbers; second, the design of an alternative to sequential restarts for dynamically adapting the population size, using a mixture model extension of CMA-ES (MM-CMA-ES). On the standard Coco benchmark for evaluating global stochastic optimization methods, the use of Sobol numbers yields a fairly uniform improvement, as already shown by Teytaud last year. On the other hand, MM-CMA-ES shows no speed-up with respect to CMA-ES with the IPOP restart strategy, even on objective functions with many local minima such as the Rastrigin function. The reasons are the overhead in objective-function evaluations introduced by the MM strategy, and the quite subtle way the adaptive step-size strategy of CMA-ES escapes configurations where one (large) normal distribution covers several local minima. We conclude with some perspectives for improvement.
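The first idea above, replacing pseudo-random normal draws with quasi-random ones, can be sketched as follows. This is a minimal illustration, not the authors' code: it assumes the common construction of pushing a scrambled Sobol sequence through the normal inverse CDF, using SciPy's `qmc.Sobol` sampler; the resulting vectors would stand in for the pseudo-random normal perturbations that CMA-ES uses to generate its candidate population.

```python
import numpy as np
from scipy.stats import norm, qmc

def sobol_normal_samples(n, dim, seed=0):
    """Draw n quasi-random standard-normal vectors of dimension dim
    by mapping a scrambled Sobol sequence through the normal inverse CDF."""
    sampler = qmc.Sobol(d=dim, scramble=True, seed=seed)
    u = sampler.random(n)   # low-discrepancy uniforms in (0, 1); n a power of 2 is preferred
    return norm.ppf(u)      # transform to N(0, 1) marginals

# In CMA-ES, candidates are x_k = m + sigma * B @ D @ z_k; the z_k below
# would replace the usual pseudo-random normal draws.
z = sobol_normal_samples(8, 3)
print(z.shape)  # (8, 3)
```

Because low-discrepancy points cover the unit cube more evenly than pseudo-random ones, the transformed samples spread more evenly over the search distribution, which is consistent with the uniform improvement reported on the Coco benchmark.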
Main file
RapportMMCMAES.pdf (4.04 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-01420342, version 1 (21-12-2016)

Identifiers

  • HAL Id: hal-01420342, version 1

Cite

Nicolas Vasselin, François Fages. Mixture Model-CMA-ES. 2016. ⟨hal-01420342⟩
