A step beyond local observations with a dialog aware bidirectional GRU network for Spoken Language Understanding

Abstract: Recurrent Neural Network (RNN) architectures have recently become a very popular choice for Spoken Language Understanding (SLU) problems; however, they form a large family of architectures that can furthermore be combined into more complex neural networks. In this work, we compare different recurrent networks, such as simple Recurrent Neural Networks (RNN), Long Short-Term Memory (LSTM) networks, Gated Recurrent Units (GRU) and their bidirectional versions, on the popular ATIS dataset and on MEDIA, a more complex French dataset. Additionally, we propose a novel method in which information about the presence of relevant word classes in the dialog history is combined with a bidirectional GRU, and we show that leveraging relevant word classes from the dialog history improves performance over recurrent networks that analyze the current sentence alone.
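The core idea of the abstract can be illustrated with a short sketch. The following is a minimal PyTorch mock-up (the framework, all names, dimensions, and the history-flag construction are assumptions for illustration, not the authors' implementation): a bidirectional GRU slot tagger whose token embeddings are concatenated with binary flags marking which relevant word classes have already appeared in the dialog history.

import torch
import torch.nn as nn

class DialogAwareBiGRUTagger(nn.Module):
    # Hypothetical sketch: a BiGRU tagger augmented with dialog-history
    # word-class presence flags, as described in the abstract.
    def __init__(self, vocab_size, num_labels, num_word_classes,
                 embed_dim=100, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Each token embedding is concatenated with a binary vector that
        # flags the word classes already observed in the dialog history.
        self.gru = nn.GRU(embed_dim + num_word_classes, hidden_dim,
                          batch_first=True, bidirectional=True)
        self.out = nn.Linear(2 * hidden_dim, num_labels)

    def forward(self, token_ids, history_class_flags):
        # token_ids:           (batch, seq_len) word indices
        # history_class_flags: (batch, num_word_classes) 0/1 presence flags
        emb = self.embed(token_ids)                      # (B, T, E)
        flags = history_class_flags.unsqueeze(1).expand(
            -1, emb.size(1), -1)                         # (B, T, C)
        states, _ = self.gru(torch.cat([emb, flags], dim=-1))
        return self.out(states)                          # (B, T, labels)

# Toy usage: 3 illustrative word classes (e.g. CITY, DATE, PRICE).
model = DialogAwareBiGRUTagger(vocab_size=1000, num_labels=20,
                               num_word_classes=3)
tokens = torch.randint(0, 1000, (2, 7))           # batch of 2 sentences
flags = torch.tensor([[1., 0., 1.], [0., 0., 0.]])
logits = model(tokens, flags)                     # per-token label scores

Broadcasting the per-dialog flag vector across every time step is one plausible way to inject history information into an otherwise sentence-local tagger; the paper may well combine the features differently.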
Document type:
Conference paper
Interspeech, Sep 2016, San Francisco, United States

https://hal.inria.fr/hal-01351733
Contributor: Vedran Vukotić
Submitted on: Thursday, August 4, 2016 - 16:09:52
Last modified on: Friday, February 17, 2017 - 16:11:06
Document(s) archived on: Tuesday, November 8, 2016 - 20:36:15

File

Vukotic_Interspeech_2016.pdf
Files produced by the author(s)

Identifiers

  • HAL Id : hal-01351733, version 1

Citation

Vedran Vukotic, Christian Raymond, Guillaume Gravier. A step beyond local observations with a dialog aware bidirectional GRU network for Spoken Language Understanding. Interspeech, Sep 2016, San Francisco, United States. <hal-01351733>

Metrics

Record views: 645
Document downloads: 191