Mixture Model-CMA-ES
Abstract
We report on our attempt to improve the CMA-ES global optimization algorithm based on two ideas: first, the use of Sobol quasi-random low-discrepancy sequences instead of pseudo-random numbers; second, the design of an alternative to sequential restarts for dynamically adapting the population size, using a mixture-model extension of CMA-ES (MM-CMA-ES). On the standard Coco benchmark for evaluating global stochastic optimization methods, the use of Sobol numbers yields a fairly uniform improvement, as was already shown by Teytaud last year. On the other hand, MM-CMA-ES shows no speed-up with respect to CMA-ES with the IPOP restart strategy, even on objective functions with many local minima, such as the Rastrigin function. The reasons are the overhead in the number of objective-function evaluations introduced by the MM strategy, and the very subtle effect of the adaptive step-size strategy of CMA-ES in escaping the covering of several local minima by one (large) normal distribution. We conclude with some perspectives for improvement.
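The first idea above, replacing pseudo-random draws in the CMA-ES sampling step with low-discrepancy points, can be sketched as follows. This is a minimal illustration, not the authors' implementation: for brevity it uses a pure-Python Halton sequence as a stand-in for Sobol points, maps the uniform points to standard normals via the Box-Muller transform, and samples with an identity covariance; the names `halton`, `quasi_gaussian_pairs`, `m`, and `sigma` are all hypothetical.

```python
import math

def halton(index, base):
    """Van der Corput radical inverse in the given base (one Halton coordinate)."""
    f, r = 1.0, 0.0
    while index > 0:
        f /= base
        r += f * (index % base)
        index //= base
    return r

def quasi_gaussian_pairs(n, bases=(2, 3)):
    """Map 2-D low-discrepancy points in (0,1)^2 to standard normals (Box-Muller)."""
    out = []
    for i in range(1, n + 1):  # start at 1: the 0-th Halton point is the origin
        u1, u2 = halton(i, bases[0]), halton(i, bases[1])
        r = math.sqrt(-2.0 * math.log(u1))
        out.append((r * math.cos(2 * math.pi * u2),
                    r * math.sin(2 * math.pi * u2)))
    return out

# CMA-ES-style sampling step x = m + sigma * z, here with identity covariance;
# a full implementation would multiply z by the covariance factor B * D.
m, sigma = (0.0, 0.0), 0.5
population = [(m[0] + sigma * z1, m[1] + sigma * z2)
              for z1, z2 in quasi_gaussian_pairs(8)]
```

In a real experiment one would instead use a scrambled Sobol generator (e.g. `scipy.stats.qmc.Sobol`) and the inverse normal CDF, since plain Box-Muller can distort the low-discrepancy structure.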
Domains
Computer Science [cs]
Origin: Files produced by the author(s)