Extreme Compass and Dynamic Multi-Armed Bandits for Adaptive Operator Selection

Jorge Maturana (1), Álvaro Fialho (2), Frédéric Saubion (1), Marc Schoenauer (2,3,4), Michèle Sebag (2,3,4)
(4) TAO - Machine Learning and Optimisation: CNRS - Centre National de la Recherche Scientifique (UMR 8623), Inria Saclay - Île-de-France, UP11 - Université Paris-Sud (Paris 11), LRI - Laboratoire de Recherche en Informatique
Abstract: The goal of Adaptive Operator Selection (AOS) is the on-line control of the choice of variation operators within Evolutionary Algorithms. The control process is based on two main components: credit assignment, which defines the reward used to evaluate the quality of an operator after it has been applied, and the operator selection mechanism, which selects one operator based on the operators' qualities. Two previously developed AOS methods are combined here. Compass evaluates the performance of operators by considering not only the fitness improvement from parent to offspring, but also the way they modify the diversity of the population, and their execution time. Dynamic Multi-Armed Bandit proposes a selection strategy based on the well-known UCB algorithm, achieving a compromise between exploitation and exploration while nevertheless quickly adapting to changes. Tests with the proposed method, called ExCoDyMAB, are carried out on several hard instances of the Satisfiability problem (SAT). Results show the good synergistic effect of combining both approaches.
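The UCB-style bandit selection the abstract refers to can be sketched as follows. This is a generic UCB1 sketch under stated assumptions, not the paper's Dynamic Multi-Armed Bandit or Compass credit assignment: the function name, the exploration constant `c`, and the toy reward model are illustrative choices.

```python
import math
import random

def ucb_select(counts, rewards, c=2.0):
    """Return the index of the operator with the highest UCB score.

    counts[i]  -- how many times operator i has been applied so far
    rewards[i] -- cumulative reward credited to operator i
    c          -- exploration constant (a tunable assumption)
    """
    # Apply each operator at least once before trusting the statistics.
    for i, n in enumerate(counts):
        if n == 0:
            return i
    total = sum(counts)
    scores = [
        rewards[i] / counts[i]                      # exploitation: empirical mean
        + math.sqrt(c * math.log(total) / counts[i])  # exploration: uncertainty bonus
        for i in range(len(counts))
    ]
    return max(range(len(scores)), key=scores.__getitem__)

# Toy loop with three operators; operator 1 yields higher rewards on average,
# so UCB concentrates on it while still occasionally exploring the others.
random.seed(0)
counts, rewards = [0, 0, 0], [0.0, 0.0, 0.0]
for _ in range(200):
    op = ucb_select(counts, rewards)
    reward = random.random() * (0.9 if op == 1 else 0.3)  # stand-in for credit assignment
    counts[op] += 1
    rewards[op] += reward
print(counts)
```

In an actual AOS setting, the reward would come from a credit-assignment scheme (in this paper, Compass's aggregation of fitness improvement, diversity change, and execution time) rather than the synthetic reward used above.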
Document type: Conference papers

https://hal.inria.fr/inria-00377450
Contributor: Álvaro Fialho
Submitted on: Wednesday, April 22, 2009

File: PaperCEC09.pdf (produced by the author(s))

Identifiers

  • HAL Id: inria-00377450, version 1

Citation

Jorge Maturana, Álvaro Fialho, Frédéric Saubion, Marc Schoenauer, Michèle Sebag. Extreme Compass and Dynamic Multi-Armed Bandits for Adaptive Operator Selection. IEEE Congress on Evolutionary Computation, May 2009, Trondheim, Norway. ⟨inria-00377450v1⟩
