Modeling Labial Coarticulation with Bidirectional Gated Recurrent Networks and Transfer Learning - Inria - Institut national de recherche en sciences et technologies du numérique
Conference Paper, Year: 2019

Modeling Labial Coarticulation with Bidirectional Gated Recurrent Networks and Transfer Learning

Abstract

In this study, we investigate how to learn labial coarticulation in order to generate a sparse representation of the face from speech. To do so, we experiment with a sequential deep learning model, bidirectional gated recurrent networks, which have achieved good results on the articulatory inversion problem and should therefore be able to handle coarticulation effects. As acquiring audiovisual corpora is expensive and time-consuming, we designed our solution to counteract the lack of data. First, we used phonetic information (phoneme labels and their durations) as input to ensure speaker independence; second, we experimented with pretraining strategies to reach acceptable performance. We demonstrate how a careful initialization of the last layers of the network can greatly ease training and help to handle coarticulation effects. This initialization relies on dimensionality reduction strategies, allowing us to inject knowledge of a useful latent representation of the visual data into the network. We focused on two data-driven tools (PCA and an autoencoder) and one hand-crafted latent space from the animation community, the blendshapes decomposition. We trained and evaluated the model on a corpus of 4 hours of French speech and obtained an average RMSE close to 1.3 mm.
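The abstract does not include code, but the PCA variant of the last-layer initialization can be sketched in a few lines: fit PCA on the visual training frames, then use the principal components as the weights of the network's final linear layer, so the recurrent layers only have to predict low-dimensional PCA coefficients. The function name and array shapes below are illustrative, not taken from the paper:

```python
import numpy as np

def pca_output_layer(visual_frames, n_components):
    """Fit PCA on visual frames (n_frames, n_features) and return
    (W, b) for a final linear layer y = coeffs @ W.T + b, so the
    layer decodes PCA coefficients back into facial coordinates."""
    mean = visual_frames.mean(axis=0)
    centered = visual_frames - mean
    # Principal axes via SVD; rows of vt are the orthonormal components.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    components = vt[:n_components]            # (k, n_features)
    return components.T, mean                 # W: (n_features, k), b: (n_features,)

# Toy usage: 200 frames of 30 landmark coordinates.
rng = np.random.default_rng(0)
frames = rng.normal(size=(200, 30))
W, b = pca_output_layer(frames, n_components=5)
coeffs = (frames - b) @ W                     # encode: project onto components
recon = coeffs @ W.T + b                      # decode with the initialized layer
```

In a full model, `W` and `b` would seed the output layer of the bidirectional GRU network before fine-tuning, which is the "injecting knowledge of a useful latent representation" described above.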
Main file: 2097_Paper.pdf (1.25 MB). Origin: files produced by the author(s).

Dates and versions

hal-02175780 , version 1 (06-07-2019)

Identifiers

  • HAL Id : hal-02175780 , version 1

Cite

Théo Biasutto--Lervat, Sara Dahmani, Slim Ouni. Modeling Labial Coarticulation with Bidirectional Gated Recurrent Networks and Transfer Learning. INTERSPEECH 2019 - 20th Annual Conference of the International Speech Communication Association, Sep 2019, Graz, Austria. ⟨hal-02175780⟩
195 views
312 downloads
