
COCO: A Platform for Comparing Continuous Optimizers in a Black-Box Setting

Abstract: We introduce COCO, an open-source platform for Comparing Continuous Optimizers in a black-box setting. COCO aims to automate, to the greatest possible extent, the tedious and repetitive task of benchmarking numerical optimization algorithms. The platform and its underlying methodology allow deterministic and stochastic solvers for both single- and multi-objective optimization to be benchmarked within the same framework. We present the rationale behind the (decade-long) development of the platform as a general proposition of guidelines towards better benchmarking. We detail fundamental concepts of COCO such as the definition of a problem as a function instance, the underlying idea of instances, the use of target values, and runtime, defined by the number of function calls, as the central performance measure. Finally, we give a quick overview of the basic code structure and the currently available test suites.
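The abstract's central performance measure, runtime to a target value, can be illustrated with a small sketch. Note that this is not COCO's actual API: the sphere function, the `runtime_to_target` helper, and the use of random search in [-5, 5]^d are our own illustrative assumptions chosen to make the measurement idea concrete.

```python
import random


def sphere(x):
    """Toy objective (not part of COCO): the sphere function, minimum 0 at the origin."""
    return sum(xi * xi for xi in x)


def runtime_to_target(problem, target, dimension, budget, seed=0):
    """Count function calls until the solver first reaches `target`.

    Hypothetical illustration of COCO's runtime measure: the solver here
    is plain random search over the box [-5, 5]^dimension, and runtime is
    the number of calls to `problem` before a value <= target is found.
    Returns None if the target is not hit within `budget` calls.
    """
    rng = random.Random(seed)
    for evals in range(1, budget + 1):
        x = [rng.uniform(-5, 5) for _ in range(dimension)]
        if problem(x) <= target:
            return evals  # runtime: function calls until the target is reached
    return None  # target not reached within the evaluation budget


rt = runtime_to_target(sphere, target=1.0, dimension=2, budget=10000)
print(rt)
```

In COCO, such runtimes are recorded for many (problem, instance, target) triples and aggregated into empirical runtime distributions; the sketch only shows the single-run measurement.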


https://hal.inria.fr/hal-01294124
Contributor: Nikolaus Hansen
Submitted on: Wednesday, August 26, 2020 - 6:01:41 PM
Last modification: Tuesday, February 9, 2021 - 11:38:02 AM

Files

coco-doc.pdf (produced by the author(s))

Citation

Nikolaus Hansen, Anne Auger, Raymond Ros, Olaf Mersmann, Tea Tušar, et al. COCO: A Platform for Comparing Continuous Optimizers in a Black-Box Setting. Optimization Methods and Software, Taylor & Francis, In press, 36 (1), pp. 114-144. ⟨10.1080/10556788.2020.1808977⟩. ⟨hal-01294124v4⟩

Metrics

Record views: 205
File downloads: 566