Conference paper, 2010

Unsupervised Layer-Wise Model Selection in Deep Neural Networks

Abstract

Deep Neural Networks (DNNs) offer a new and efficient machine learning architecture based on the layer-wise building of several representation layers. A critical issue for DNNs remains model selection, e.g., selecting the number of neurons in each DNN layer. The hyper-parameter search space grows exponentially with the number of layers, making the popular grid-search approach to finding good hyper-parameter values intractable. The question investigated in this paper is whether the unsupervised, layer-wise methodology used to train a DNN can be extended to model selection as well. The proposed approach, based on an unsupervised criterion, empirically examines whether model selection is a modular optimization problem that can be tackled in a layer-wise manner. Preliminary results on the MNIST data set suggest the answer is positive. Further, some unexpected results regarding the optimal layer size, depending on the training process, are reported and discussed.
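
The abstract describes a greedy, layer-wise use of an unsupervised criterion to choose each layer's width. As a rough illustration only, not the paper's actual procedure or criterion, the sketch below selects each layer's size by training small tied-weight autoencoders and keeping the width with the lowest validation reconstruction error; the candidate widths, training hyper-parameters, and toy data are illustrative assumptions.

```python
# Hypothetical sketch: greedy layer-wise model selection with an
# unsupervised criterion (here, autoencoder reconstruction error).
# Candidate widths, learning rate, and data are illustrative only.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_autoencoder(X, n_hidden, epochs=20, lr=0.1, batch=64):
    """Train a one-hidden-layer autoencoder with tied weights via SGD."""
    n_in = X.shape[1]
    W = rng.normal(0, 0.01, (n_in, n_hidden))
    b_h = np.zeros(n_hidden)
    b_o = np.zeros(n_in)
    for _ in range(epochs):
        perm = rng.permutation(len(X))
        for start in range(0, len(X), batch):
            xb = X[perm[start:start + batch]]
            h = sigmoid(xb @ W + b_h)           # encode
            r = sigmoid(h @ W.T + b_o)          # decode (tied weights)
            dr = (r - xb) * r * (1 - r)         # output pre-activation grad
            dh = (dr @ W) * h * (1 - h)         # hidden pre-activation grad
            W -= lr * (xb.T @ dh + dr.T @ h) / len(xb)
            b_h -= lr * dh.mean(axis=0)
            b_o -= lr * dr.mean(axis=0)
    return W, b_h, b_o

def reconstruction_error(X, W, b_h, b_o):
    """Unsupervised criterion: mean squared reconstruction error."""
    h = sigmoid(X @ W + b_h)
    r = sigmoid(h @ W.T + b_o)
    return np.mean((r - X) ** 2)

def layerwise_select(X_train, X_val, candidate_sizes, n_layers):
    """Greedily pick each layer's width by the unsupervised criterion,
    then propagate the data through the chosen layer before moving on."""
    chosen = []
    for layer in range(n_layers):
        best = None
        for n_hidden in candidate_sizes:
            params = train_autoencoder(X_train, n_hidden)
            score = reconstruction_error(X_val, *params)
            if best is None or score < best[0]:
                best = (score, n_hidden, params)
        score, n_hidden, (W, b_h, _) = best
        chosen.append(n_hidden)
        # Fix this layer and encode the data for the next level.
        X_train = sigmoid(X_train @ W + b_h)
        X_val = sigmoid(X_val @ W + b_h)
        print(f"layer {layer + 1}: width {n_hidden}, val error {score:.4f}")
    return chosen

if __name__ == "__main__":
    # Toy data standing in for MNIST; replace with real 784-d inputs.
    X = rng.random((512, 64))
    layerwise_select(X[:400], X[400:], candidate_sizes=[16, 32, 64], n_layers=2)
```

Because each layer is frozen once its width is chosen, the search cost grows linearly with the number of layers rather than exponentially, which is the modularity hypothesis the paper tests empirically.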
Main file: ECAI-632.pdf (537.99 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-00488338, version 1 (01-06-2010)

Identifiers

Cite

Ludovic Arnold, Hélène Paugam-Moisy, Michèle Sebag. Unsupervised Layer-Wise Model Selection in Deep Neural Networks. 19th European Conference on Artificial Intelligence (ECAI'10), Aug 2010, Lisbon, Portugal. ⟨10.3233/978-1-60750-606-5-915⟩. ⟨hal-00488338⟩
767 views
268 downloads
