On Proving Linear Convergence of Comparison-based Step-size Adaptive Randomized Search on Scaling-Invariant Functions via Stability of Markov Chains

Anne Auger 1 Nikolaus Hansen 1
1 TAO - Machine Learning and Optimisation
CNRS - Centre National de la Recherche Scientifique : UMR8623, Inria Saclay - Ile de France, UP11 - Université Paris-Sud - Paris 11, LRI - Laboratoire de Recherche en Informatique
Abstract : In the context of numerical optimization, this paper develops a methodology to analyze the linear convergence of comparison-based step-size adaptive randomized search (CB-SARS), a class of probabilistic derivative-free optimization algorithms in which the objective function is used solely through comparisons of candidate solutions. Various algorithms belong to the class of CB-SARS. On the one hand, it includes methods introduced as early as the 1960s: the step-size adaptive random search by Schumer and Steiglitz, the compound random search by Devroye, and simplified versions of Matyas' random optimization algorithm and of Kjellstrom and Taxen's Gaussian adaptation. On the other hand, it includes simplified versions of several recent algorithms: the covariance matrix adaptation evolution strategy (CMA-ES), the exponential natural evolution strategy (xNES), and the cross-entropy method. CB-SARS algorithms typically exhibit several invariances. First, invariance under composition of the objective function with a strictly monotonic transformation, a direct consequence of the fact that the algorithms only use comparisons. Second, scale invariance, which reflects the fact that the algorithms have no intrinsic absolute notion of scale. The algorithms are investigated on scaling-invariant functions, defined as functions that preserve the ordering of two points (with respect to their objective function values) when both points are scaled by the same positive factor (the scaling being done with respect to a single fixed reference point). This class of functions includes compositions of norms with strictly increasing functions, as well as some non-quasi-convex and non-continuous functions. We prove that, on scaling-invariant functions, these invariance properties entail the existence of a homogeneous Markov chain whose stability implies linear convergence of the original algorithm.
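To make the scaling-invariance property concrete, here is a minimal illustrative sketch (not from the paper): an empirical check that a candidate function preserves the ordering of pairs of points when both are scaled by the same positive factor around a reference point. The function `is_scaling_invariant_sample` and its parameters are hypothetical names chosen for this example.

```python
import numpy as np

def is_scaling_invariant_sample(f, x_ref, n_trials=1000, seed=0):
    """Empirically test scaling invariance of f w.r.t. reference point x_ref:
    for rho > 0 and points x, y,
      f(x_ref + x) <= f(x_ref + y)  iff  f(x_ref + rho*x) <= f(x_ref + rho*y).
    Returns False as soon as a random counterexample is found."""
    rng = np.random.default_rng(seed)
    dim = len(x_ref)
    for _ in range(n_trials):
        x = rng.standard_normal(dim)
        y = rng.standard_normal(dim)
        rho = rng.uniform(0.1, 10.0)
        order = f(x_ref + x) <= f(x_ref + y)
        scaled_order = f(x_ref + rho * x) <= f(x_ref + rho * y)
        if order != scaled_order:
            return False
    return True

# A composition of a norm with a strictly increasing function is
# scaling invariant with respect to the origin:
f = lambda z: np.exp(np.linalg.norm(z))
print(is_scaling_invariant_sample(f, x_ref=np.zeros(3)))
```

The random check passes for this example because the ordering of `f` values around the origin depends only on the ordering of the norms, which scaling by a common factor preserves.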
We thus turn the problem of proving linear convergence of CB-SARS optimization algorithms into the study of the stability of a joint Markov chain. We derive sufficient conditions for the global linear convergence of CB-SARS, expressed in terms of different stability conditions on the Markov chain (irreducibility, positivity, Harris recurrence, geometric ergodicity). The methodology presented here is applied in a companion paper, where the sufficient stability conditions are proven for the (1+1)-ES with a generalized one-fifth success rule that coincides with Schumer and Steiglitz' or Devroye's algorithms. This yields the first proof of linear convergence for this algorithm, and the first proof of linear convergence of a scale-invariant CB-SARS on such a wide family of functions, including in particular non-quasi-convex and non-continuous functions.
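For the reader unfamiliar with the algorithm analyzed in the companion paper, the following is a minimal sketch of a (1+1)-ES with a one-fifth-style success rule. The step-size update constants here are illustrative choices, not the paper's exact parameterization; note that the objective function enters only through a comparison.

```python
import numpy as np

def one_plus_one_es(f, x0, sigma0=1.0, iters=500, seed=0):
    """Sketch of a (1+1)-ES with a one-fifth-style success rule.
    The step size sigma grows on success and shrinks on failure;
    the constants below balance at a target success rate of 1/5:
    (1/5)*(1/3) = (4/5)*(1/12)."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    sigma = sigma0
    fx = f(x)
    for _ in range(iters):
        candidate = x + sigma * rng.standard_normal(len(x))
        fc = f(candidate)
        if fc <= fx:                       # comparison-based acceptance
            x, fx = candidate, fc
            sigma *= np.exp(1.0 / 3.0)     # expand step size on success
        else:
            sigma *= np.exp(-1.0 / 12.0)   # shrink step size on failure
    return x, sigma

x_opt, _ = one_plus_one_es(lambda z: np.linalg.norm(z), x0=[3.0, -2.0])
print(np.linalg.norm(x_opt))  # distance to the optimum after 500 iterations
```

On a scaling-invariant function such as the norm above, both the distance to the optimum and the step size decrease geometrically on average, which is the linear convergence behavior the paper's Markov chain methodology is designed to prove.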
Document type :
Preprints, Working Papers, ...

https://hal.inria.fr/hal-00877160
Contributor : Anne Auger
Submitted on : Sunday, October 27, 2013 - 12:29:25 AM
Last modification on : Sunday, July 21, 2019 - 1:48:11 AM
Long-term archiving on : Friday, April 7, 2017 - 6:05:01 PM

File

AugerHansenLinearCV-CB-SARS-Oc...
Files produced by the author(s)

Identifiers

  • HAL Id : hal-00877160, version 1

Citation

Anne Auger, Nikolaus Hansen. On Proving Linear Convergence of Comparison-based Step-size Adaptive Randomized Search on Scaling-Invariant Functions via Stability of Markov Chains. 2013. ⟨hal-00877160v1⟩
