Toward Optimal Run Racing: Application to Deep Learning Calibration

Olivier Bousquet 1 Sylvain Gelly 1 Karol Kurach 1 Marc Schoenauer 2, 3 Michèle Sebag 3, 2 Olivier Teytaud 1 Damien Vincent 1
2 TAU (TAckling the Underspecified)
3 LRI - Laboratoire de Recherche en Informatique, UP11 - Université Paris-Sud (Paris 11), Inria Saclay - Île-de-France, CNRS - Centre National de la Recherche Scientifique (UMR 8623)
Abstract: This paper addresses one-shot learning of deep neural nets in a highly parallel setting, tackling the algorithm calibration problem: selecting the best neural architecture and learning hyper-parameter values for the dataset at hand. The notoriously expensive calibration cost is reduced optimally by detecting and early-stopping non-optimal runs. The theoretical contribution establishes optimality guarantees within the multiple hypothesis testing framework. Experiments on the Cifar10, PTB and Wiki benchmarks demonstrate the relevance of the approach, with a principled and consistent improvement over the state of the art and no extra hyper-parameter.
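The run-racing idea summarized in the abstract can be illustrated with a minimal sketch. This is not the paper's actual procedure: the `race` function, the use of a Hoeffding confidence bound as the stopping test, and the assumption that scores lie in [0, 1] are all illustrative choices made here. Candidate configurations are evaluated in parallel rounds, and a run is stopped early once its empirical mean score is provably below the current leader's with high confidence.

```python
import math
import random

def race(configs, evaluate, rounds=50, delta=0.05):
    """Race candidate configurations, early-stopping any whose mean
    observed score (in [0, 1]) is significantly below the current best,
    using a Hoeffding confidence radius at level 1 - delta."""
    active = set(configs)
    sums = {c: 0.0 for c in configs}
    counts = {c: 0 for c in configs}
    for _ in range(rounds):
        # One parallel evaluation round over the surviving runs.
        for c in list(active):
            sums[c] += evaluate(c)
            counts[c] += 1
        means = {c: sums[c] / counts[c] for c in active}
        best = max(means.values())
        for c in list(active):
            # With n samples, the true mean lies within eps of the
            # empirical mean with probability >= 1 - delta (Hoeffding).
            eps = math.sqrt(math.log(2.0 / delta) / (2.0 * counts[c]))
            if means[c] + 2 * eps < best and len(active) > 1:
                active.discard(c)  # stop: significantly worse than the leader
    # Return the surviving configuration with the best empirical mean.
    return max(active, key=lambda c: sums[c] / counts[c])
```

In a real calibration setting, `evaluate` would train the corresponding network for a few more steps and return a validation score; the sketch only captures the statistical stopping rule.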
Document type:
Preprint / working paper
Contributor: Marc Schoenauer
Submitted on: Tuesday, 14 November 2017 - 09:27:48
Last modified: Tuesday, 8 January 2019 - 08:36:01



  • HAL Id: hal-01634381, version 1
  • arXiv: 1706.03199


Olivier Bousquet, Sylvain Gelly, Karol Kurach, Marc Schoenauer, Michèle Sebag, et al. Toward Optimal Run Racing: Application to Deep Learning Calibration. 2017. 〈hal-01634381〉


