Conference paper, 2019

Comparison of Neural Network Optimizers for Relative Ranking Retention Between Neural Architectures

Abstract

Autonomous design and optimization of neural networks is attracting increasing attention from the research community. The main barrier is the computational cost of conducting experimental and production projects. Although most researchers focus on new design methodologies, the dominant computational cost remains the evaluation of candidate architectures. In this paper we investigate the feasibility of reduced-epoch training by measuring the rank correlation coefficients between sets of optimizers, given a fixed number of training epochs. We discover ranking correlations above 0.75, and up to 0.964, between Adam with 50 training epochs, stochastic gradient descent with Nesterov momentum with 10 training epochs, and Adam with 20 training epochs. Moreover, we show the ability of genetic algorithms to find high-quality solutions to a function by searching in a perturbed search space, provided that certain correlation criteria are met.
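As a rough illustration of the measurement the abstract describes (not the authors' code), the sketch below computes rank correlation coefficients between architecture rankings obtained under a full training budget and a reduced one, using SciPy's kendalltau and spearmanr. The architecture names and accuracy values are hypothetical placeholders; only the procedure, ranking the same candidates under two budgets and correlating the ranks, follows the paper's setup.

```python
# Minimal sketch: does a cheap training budget preserve the relative
# ranking of candidate architectures obtained under a full budget?
from scipy.stats import kendalltau, spearmanr

# Validation accuracy of each candidate architecture under a full budget
# (e.g. Adam, 50 epochs) and a reduced budget (e.g. SGD + Nesterov, 10 epochs).
# All values below are made-up placeholders for illustration.
full_budget = {"arch_a": 0.91, "arch_b": 0.87, "arch_c": 0.89, "arch_d": 0.80}
reduced_budget = {"arch_a": 0.74, "arch_b": 0.69, "arch_c": 0.72, "arch_d": 0.61}

# Align the two score lists on the same architecture order.
archs = sorted(full_budget)
full_scores = [full_budget[a] for a in archs]
reduced_scores = [reduced_budget[a] for a in archs]

# Rank correlation between the two rankings; values near 1 mean the
# cheap budget orders the architectures almost the same way.
tau, _ = kendalltau(full_scores, reduced_scores)
rho, _ = spearmanr(full_scores, reduced_scores)
print(f"Kendall tau: {tau:.3f}, Spearman rho: {rho:.3f}")
```

If the correlation is high enough, candidate architectures can be compared after only a few epochs, which is where the claimed savings in architecture search come from.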
Main file: 483292_1_En_22_Chapter.pdf (415.92 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-02331350, version 1 (24-10-2019)

License

Attribution (CC BY)

Identifiers

HAL Id: hal-02331350
DOI: 10.1007/978-3-030-19823-7_22

Cite

George Kyriakides, Konstantinos Margaritis. Comparison of Neural Network Optimizers for Relative Ranking Retention Between Neural Architectures. 15th IFIP International Conference on Artificial Intelligence Applications and Innovations (AIAI), May 2019, Hersonissos, Greece. pp.272-281, ⟨10.1007/978-3-030-19823-7_22⟩. ⟨hal-02331350⟩
64 Views
41 Downloads
