Seq2Biseq: Bidirectional Output-wise Recurrent Neural Networks for Sequence Modelling - Inria - National Institute for Research in Digital Science and Technology
Conference paper - Year: 2019


Abstract

During the last couple of years, Recurrent Neural Networks (RNNs) have reached state-of-the-art performance on most sequence modelling problems. In particular, sequence-to-sequence models and neural CRFs have proved very effective in this domain. In this article, we propose a new RNN architecture for sequence labelling that leverages gated recurrent layers to take arbitrarily long contexts into account, and that uses two decoders operating forward and backward. We compare several variants of the proposed solution and their performance to the state of the art. Most of our results are better than, or very close to, the state of the art, and thanks to the use of recent technologies our architecture can scale to corpora larger than those used in this work.
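To make the "two decoders operating forward and backward" idea concrete, here is a minimal toy sketch in pure Python. It is not the paper's model: the paper uses gated recurrent (GRU-like) layers, while this sketch substitutes a plain scalar tanh recurrence with hypothetical fixed weights, purely to illustrate how each output position can combine a left-to-right and a right-to-left pass over the input.

```python
import math

def rnn_cell(x, h, wx=0.5, wh=0.3):
    # One plain tanh recurrent step (scalar toy stand-in for a gated cell):
    # h' = tanh(wx * x + wh * h). Weights wx, wh are arbitrary toy values.
    return math.tanh(wx * x + wh * h)

def bidirectional_states(xs):
    # Left-to-right pass: each state summarises the prefix up to position t.
    fwd, h = [], 0.0
    for x in xs:
        h = rnn_cell(x, h)
        fwd.append(h)
    # Right-to-left pass: each state summarises the suffix from position t.
    bwd, h = [], 0.0
    for x in reversed(xs):
        h = rnn_cell(x, h)
        bwd.append(h)
    bwd.reverse()
    # Each position now sees both its left and its right context.
    return list(zip(fwd, bwd))

def label(xs, threshold=0.0):
    # Output-wise decision: one label per input position, computed from
    # the combined forward and backward states at that position.
    return [1 if f + b > threshold else 0 for f, b in bidirectional_states(xs)]
```

For example, `label([1.0, 1.0])` yields `[1, 1]` and `label([-1.0, -1.0])` yields `[0, 0]`: every position receives one label, and each decision depends on the whole sequence rather than on the prefix alone, which is the property the bidirectional decoders are designed to provide.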
Main file: main.pdf (145.66 KB)
Origin: files produced by the author(s)

Dates and versions

hal-02085093 , version 1 (30-03-2019)

License

Attribution - ShareAlike (CC BY-SA)

Identifiers

  • HAL Id: hal-02085093, version 1

Cite

Marco Dinarelli, Loïc Grobol. Seq2Biseq: Bidirectional Output-wise Recurrent Neural Networks for Sequence Modelling. CICLing 2019 - 20th International Conference on Computational Linguistics and Intelligent Text Processing, Apr 2019, La Rochelle, France. ⟨hal-02085093⟩
123 Views
268 Downloads
