Conference paper, Year: 2018

Can recurrent neural networks warp time?

Corentin Tallec
  • Role: Author
  • PersonId : 1032966

Abstract

Successful recurrent models such as long short-term memories (LSTMs) and gated recurrent units (GRUs) use ad hoc gating mechanisms. Empirically, these models have been found to improve the learning of medium- to long-term temporal dependencies and to help with vanishing gradient issues. We prove that learnable gates in a recurrent model formally provide quasi-invariance to general time transformations in the input data. We recover part of the LSTM architecture from a simple axiomatic approach. This result leads to a new way of initializing gate biases in LSTMs and GRUs. Experimentally, this new chrono initialization is shown to greatly improve learning of long-term dependencies, with minimal implementation effort.

Recurrent neural networks (e.g., Jaeger, 2002) are a standard machine learning tool to model and represent temporal data; mathematically, they amount to learning the parameters of a parameterized dynamical system so that its behavior optimizes some criterion, such as the prediction of the next data point in a sequence.
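For readers who want to try the chrono initialization mentioned in the abstract, a minimal PyTorch sketch is given below. The helper name `chrono_init`, the layer sizes, and the value of `t_max` (a rough estimate of the longest temporal dependency in the data) are illustrative assumptions, not part of this record or of the authors' released code; the idea is simply to draw the forget-gate bias from log(U(1, t_max − 1)) and set the input-gate bias to its negative.

```python
import torch
import torch.nn as nn

def chrono_init(lstm: nn.LSTM, t_max: float) -> None:
    """Hypothetical helper: chrono initialization of LSTM gate biases.

    Draws the forget-gate bias from log(U(1, t_max - 1)) and sets the
    input-gate bias to its negative, where t_max estimates the longest
    temporal dependency expected in the data.
    """
    h = lstm.hidden_size
    with torch.no_grad():
        for name, bias in lstm.named_parameters():
            if "bias_ih" in name:
                bias.zero_()
                # PyTorch orders LSTM gates as (input, forget, cell, output).
                b_f = torch.log(torch.empty(h).uniform_(1.0, t_max - 1.0))
                bias[h:2 * h].copy_(b_f)   # forget-gate bias
                bias[:h].copy_(-b_f)       # input-gate bias
            elif "bias_hh" in name:
                # Zero the second bias vector so the two biases do not
                # add up to twice the intended value.
                bias.zero_()

# Example: dependencies expected up to roughly 1000 time steps.
lstm = nn.LSTM(input_size=32, hidden_size=64)
chrono_init(lstm, t_max=1000.0)
```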
Main file
iclr_chrono.pdf (973.21 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-01812064, version 1 (11-06-2018)

Identifiers

  • HAL Id: hal-01812064, version 1

Cite

Corentin Tallec, Yann Ollivier. Can recurrent neural networks warp time?. International Conference on Learning Representations (ICLR) 2018, Apr 2018, Vancouver, Canada. ⟨hal-01812064⟩
191 views
207 downloads
