Mixture Model-CMA-ES

Abstract: We report on our attempt to improve the CMA-ES global optimization algorithm based on two ideas: first, the use of Sobol quasi-random low-discrepancy numbers instead of pseudo-random numbers; second, the design of an alternative to sequential restarts for dynamically adapting the population size, using a mixture model extension of CMA-ES (MM-CMA-ES). On the standard COCO benchmark for evaluating stochastic global optimization methods, the use of Sobol numbers yields a fairly uniform improvement, as already shown by Teytaud last year. On the other hand, MM-CMA-ES shows no speed-up with respect to CMA-ES with the IPOP restart strategy, even on objective functions with many local minima such as the Rastrigin function. The reasons are the overhead in the number of objective function evaluations introduced by the MM strategy, and the quite subtle effect of the adaptive step-size strategy of CMA-ES, which allows it to escape from the covering of several local minima by one (large) normal distribution. We conclude with some perspectives for improvement.
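
To make the first idea concrete, the following is a minimal sketch (an illustration under assumptions, not the code of the report) of how CMA-ES candidate sampling can use a Sobol low-discrepancy sequence instead of pseudo-random numbers: quasi-random points in [0,1)^d are mapped to standard normal deviates through the inverse normal CDF, then to N(m, sigma^2 C) through a Cholesky factor of the covariance matrix. The choice of scipy.stats.qmc.Sobol, the Rastrigin test function and all parameter values below are assumptions made for illustration.

    # Minimal sketch (assumed names and parameters, not the report's implementation):
    # quasi-random sampling of one CMA-ES population, evaluated on Rastrigin.
    import numpy as np
    from scipy.stats import qmc, norm

    def rastrigin(x):
        # Highly multimodal test function with many regularly spaced local minima.
        return 10 * len(x) + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

    dim, popsize = 10, 16        # problem dimension and population size (lambda)
    mean = np.zeros(dim)         # current distribution mean m
    sigma = 2.0                  # global step size
    C = np.eye(dim)              # covariance matrix (identity at the start)
    A = np.linalg.cholesky(C)    # A @ A.T == C

    # Scrambled Sobol points in [0,1)^dim, mapped to N(0,1) via the inverse CDF.
    sobol = qmc.Sobol(d=dim, scramble=True)
    u = sobol.random(popsize)
    z = norm.ppf(np.clip(u, 1e-12, 1 - 1e-12))   # clip to avoid +/- infinity

    # Candidate solutions x_k ~ N(m, sigma^2 C), evaluated on the objective.
    candidates = mean + sigma * z @ A.T
    fitness = np.array([rastrigin(x) for x in candidates])
    print(fitness.min())

In a full CMA-ES loop, the mean, step size and covariance matrix would then be updated from the best candidates; only the sampling step above changes when pseudo-random numbers are replaced by Sobol numbers.
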
Document type: Preprint, working paper
Internship report, 2016

https://hal.inria.fr/hal-01420342
Contributor: François Fages
Submitted on: Wednesday, 21 December 2016 - 13:33:24
Last modified on: Tuesday, 17 April 2018 - 09:08:09
Document(s) archived on: Tuesday, 21 March 2017 - 00:04:32

File

RapportMMCMAES.pdf
Files produced by the author(s)

Identifiers

  • HAL Id: hal-01420342, version 1

Citation

Nicolas Vasselin, François Fages. Mixture Model-CMA-ES. Internship report, 2016. ⟨hal-01420342⟩
