Conference papers

Differential evolution for strongly noisy optimization: Use 1.01^n resamplings at iteration n and reach the −1/2 slope

Shih-Yuan Chiu 1, Ching-Nung Lin 2, Jialin Liu 3,4, Tsang-Cheng Su 1, Fabien Teytaud 5, Olivier Teytaud 3, Shi-Jim Yen 1
3 TAO - Machine Learning and Optimisation (CNRS - Centre National de la Recherche Scientifique: UMR8623, Inria Saclay - Île-de-France, UP11 - Université Paris-Sud - Paris 11, LRI - Laboratoire de Recherche en Informatique)
Abstract: This paper is devoted to noisy optimization in the case where the noise has a standard deviation as large as the variations of the fitness values, i.e., the noise variance does not decrease to zero around the optimum. We focus on comparing methods for choosing the number of resamplings. Experiments are performed with the differential evolution algorithm. By mathematical analysis, we design a new rule for choosing the number of resamplings in noisy optimization, as a function of the dimension, and validate its efficiency against existing heuristics.
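A minimal sketch of the idea named in the title: at iteration n, each fitness evaluation of a noisy objective is averaged over ceil(1.01^n) resamplings inside a standard DE/rand/1/bin loop. This is not the authors' code; the noisy sphere objective, population size, and DE parameters below are illustrative assumptions only.

```python
# Sketch: differential evolution with an exponential resampling schedule.
# Assumptions (not from the paper): noisy sphere objective, pop_size=30,
# F=0.5, CR=0.9, 200 iterations.
import math
import numpy as np

rng = np.random.default_rng(0)

def noisy_sphere(x):
    """Sphere function plus constant-variance Gaussian noise
    (the noise does not vanish near the optimum)."""
    return float(np.sum(x ** 2) + rng.normal(0.0, 1.0))

def resampled_fitness(x, n_resamplings):
    """Average the noisy objective over n_resamplings evaluations."""
    return sum(noisy_sphere(x) for _ in range(n_resamplings)) / n_resamplings

def de_with_exponential_resampling(dim=10, pop_size=30, iterations=200,
                                   F=0.5, CR=0.9, base=1.01):
    pop = rng.uniform(-5.0, 5.0, size=(pop_size, dim))
    fitness = np.array([resampled_fitness(ind, 1) for ind in pop])
    for n in range(1, iterations + 1):
        # Exponential resampling schedule: ceil(base**n) evaluations per point.
        n_resamplings = math.ceil(base ** n)
        for i in range(pop_size):
            # DE/rand/1 mutation from three distinct other individuals.
            a, b, c = pop[rng.choice([j for j in range(pop_size) if j != i],
                                     size=3, replace=False)]
            mutant = a + F * (b - c)
            # Binomial crossover with at least one mutant component.
            cross = rng.random(dim) < CR
            cross[rng.integers(dim)] = True
            trial = np.where(cross, mutant, pop[i])
            trial_fit = resampled_fitness(trial, n_resamplings)
            # Note: the parent's stored fitness comes from an earlier,
            # less-resampled estimate; kept simple for this sketch.
            if trial_fit <= fitness[i]:
                pop[i], fitness[i] = trial, trial_fit
    best = int(np.argmin(fitness))
    return pop[best], fitness[best]

if __name__ == "__main__":
    x_best, f_best = de_with_exponential_resampling()
    print("best point:", x_best, "estimated fitness:", f_best)
```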

https://hal.inria.fr/hal-01245526
Contributor: Jialin Liu
Submitted on: Thursday, December 17, 2015 - 3:12:53 PM
Last modification on: Sunday, March 7, 2021 - 12:14:01 AM

Identifiers

HAL Id: hal-01245526
DOI: 10.1109/CEC.2015.7256911

Citation

Shih-Yuan Chiu, Ching-Nung Lin, Jialin Liu, Tsang-Cheng Su, Fabien Teytaud, et al. Differential evolution for strongly noisy optimization: Use 1.01^n resamplings at iteration n and reach the −1/2 slope. 2015 IEEE Congress on Evolutionary Computation (IEEE CEC 2015), May 2015, Sendai, Japan. pp. 338-345, ⟨10.1109/CEC.2015.7256911⟩. ⟨hal-01245526⟩
