
Toward Optimal Run Racing: Application to Deep Learning Calibration

Abstract : This paper aims at one-shot learning of deep neural networks, in a highly parallel setting, to address the algorithm calibration problem: selecting the best neural architecture and learning hyper-parameter values for the dataset at hand. The cost of this notoriously expensive calibration is reduced by detecting and early-stopping non-optimal runs. The theoretical contribution establishes optimality guarantees for this early stopping within the multiple hypothesis testing framework. Experiments on the Cifar10, PTB and Wiki benchmarks demonstrate the relevance of the approach, yielding a principled and consistent improvement over the state of the art with no extra hyper-parameter.
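
The abstract only sketches the approach: candidate runs are raced in parallel and non-optimal ones are stopped early, with guarantees derived in the multiple hypothesis testing framework. The paper itself specifies the actual statistical test; as a rough illustration of run racing in general (not of the paper's procedure), the Python sketch below prunes candidate configurations using a Hoeffding-style confidence bound with a Bonferroni correction. All names (hoeffding_radius, race, cfg_a, ...) and numeric settings are hypothetical.

    import math
    import random


    def hoeffding_radius(n, n_candidates, delta=0.05, value_range=1.0):
        # Confidence radius for the empirical mean of n observations bounded in
        # [0, value_range], with a Bonferroni correction over n_candidates
        # simultaneous comparisons (delta is the overall risk level).
        corrected_delta = delta / max(n_candidates, 1)
        return value_range * math.sqrt(math.log(2.0 / corrected_delta) / (2.0 * n))


    def race(candidates, evaluate, n_checkpoints=50, delta=0.05):
        # Race all candidate configurations; after each checkpoint, early-stop any
        # candidate whose confidence interval lies entirely above that of the
        # current best (i.e. it is significantly worse with high probability).
        alive = set(candidates)
        scores = {c: [] for c in candidates}
        for t in range(1, n_checkpoints + 1):
            for c in list(alive):
                scores[c].append(evaluate(c))  # one more noisy validation error
            means = {c: sum(scores[c]) / len(scores[c]) for c in alive}
            radius = hoeffding_radius(t, len(candidates), delta)
            best = min(alive, key=means.get)
            for c in list(alive):
                if c != best and means[c] - radius > means[best] + radius:
                    alive.discard(c)  # deemed non-optimal: stop this run
            if len(alive) == 1:
                break
        return best, alive


    if __name__ == "__main__":
        random.seed(0)
        # Hypothetical configurations with unknown true validation errors;
        # each evaluation returns a noisy measurement in [0, 1].
        true_error = {"cfg_a": 0.30, "cfg_b": 0.32, "cfg_c": 0.45, "cfg_d": 0.60}

        def noisy(c):
            return min(1.0, max(0.0, random.gauss(true_error[c], 0.05)))

        best, survivors = race(sorted(true_error), noisy)
        print("selected:", best, "| still racing:", survivors)

In this toy example the two clearly inferior configurations are stopped early, while the two close contenders keep racing until the confidence intervals separate or the checkpoint budget is exhausted.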
Document type :
Preprints, Working Papers, ...

https://hal.inria.fr/hal-01634381
Contributor : Marc Schoenauer
Submitted on : Tuesday, November 14, 2017 - 9:27:48 AM
Last modification on : Wednesday, September 16, 2020 - 5:51:00 PM


Identifiers

  • HAL Id : hal-01634381, version 1
  • ARXIV : 1706.03199

Citation

Olivier Bousquet, Sylvain Gelly, Karol Kurach, Marc Schoenauer, Michèle Sebag, et al. Toward Optimal Run Racing: Application to Deep Learning Calibration. 2017. ⟨hal-01634381⟩
