Mixture Model-CMA-ES

Abstract: We report on our attempt to improve the CMA-ES global optimization algorithm based on two ideas: first, the use of Sobol's quasi-random low-discrepancy numbers instead of pseudo-random numbers; second, the design of an alternative to sequential restarts for dynamically adapting the population size, using a mixture model extension of CMA-ES (MM-CMA-ES). On the standard Coco benchmark for evaluating global stochastic optimization methods, the use of Sobol numbers shows a fairly uniform improvement, as already shown by Teytaud last year. On the other hand, MM-CMA-ES shows no speed-up with respect to CMA-ES with the IPOP restart strategy, even on objective functions with many local minima such as the Rastrigin function. The reasons are the overhead in the number of evaluations of the objective function introduced by the MM strategy, and the very subtle effect of CMA-ES's adaptive step-size strategy, which allows it to escape from situations where several local minima are covered by a single (large) normal distribution. We conclude with some perspectives for improvement.
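As a rough illustration of the first idea only, the sketch below replaces the pseudo-random standard-normal deviates used to sample a CMA-ES population with deviates obtained by pushing a scrambled Sobol sequence through the inverse normal CDF. This is a minimal sketch assuming SciPy's qmc.Sobol generator; the function name and parameter values are hypothetical and not taken from the report.

```python
# Minimal sketch: drawing a CMA-ES population from quasi-random Sobol points
# instead of pseudo-random normal deviates. Illustration of the general idea,
# not the authors' implementation; names and parameter values are hypothetical.
import numpy as np
from scipy.stats import norm, qmc

def sample_population(mean, sigma, C, lam, sobol):
    """Draw lam candidates x_k = mean + sigma * y_k with y_k ~ N(0, C),
    where the underlying standard-normal deviates come from a scrambled
    Sobol sequence mapped through the inverse normal CDF."""
    # Sobol points in [0,1)^dim, transformed to standard-normal deviates.
    u = sobol.random(lam)                       # shape (lam, dim), low-discrepancy
    z = norm.ppf(np.clip(u, 1e-12, 1 - 1e-12))  # avoid +/- infinity at 0 and 1
    # Correlate the deviates with the covariance matrix C = B D^2 B^T.
    eigvals, B = np.linalg.eigh(C)
    y = z @ (B * np.sqrt(np.maximum(eigvals, 0.0))).T
    return mean + sigma * y

# Usage on a 10-dimensional problem (hypothetical settings; a power-of-two
# population size keeps the Sobol sequence balanced):
dim, lam = 10, 16
sobol = qmc.Sobol(d=dim, scramble=True)
pop = sample_population(np.zeros(dim), 0.5, np.eye(dim), lam, sobol)
```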
Document type: Preprints, Working Papers, ...

https://hal.inria.fr/hal-01420342
Contributor: François Fages
Submitted on: Wednesday, December 21, 2016 - 1:33:24 PM

File

RapportMMCMAES.pdf (produced by the author(s))

Identifiers

  • HAL Id: hal-01420342, version 1

Citation

Nicolas Vasselin, François Fages. Mixture Model-CMA-ES. 2016. ⟨hal-01420342⟩
