Modeling dependency between regression classes in MLLR using multiscale autoregressive models

Christophe Cerisara ¹, Khalid Daoudi ¹
¹ PAROLE - Analysis, perception and recognition of speech
INRIA Lorraine, LORIA - Laboratoire Lorrain de Recherche en Informatique et ses Applications
Abstract: Adapting acoustic models to a new environment is usually realized by applying model transformations estimated on an adaptation corpus. Since such a corpus usually contains very little data, the models' Gaussians are most often partitioned into a few regression classes, and all the Gaussians in the same class share the same transformation. It is further possible to increase the number of transformations by modeling the dependency between the regression classes. In this paper, we present such a technique, where dependency is modeled by multiscale autoregressive (MAR) processes. The power of the MAR framework resides in its ability to efficiently and optimally estimate the state vector at each node of the regression tree, based on sparse and noisy measurements at different resolutions. The method is evaluated on a French number-recognition task where the test corpus has been recorded in a car at various speeds and noise levels. The proposed adaptation method is based on Maximum Likelihood Linear Regression.
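The core MLLR operation the abstract refers to can be sketched as follows: each regression class carries one affine transform W, and every Gaussian mean in that class is adapted as W applied to the extended mean vector [1, μ]. This is a minimal illustrative sketch; the dimensions, the toy transform values, and the function name `apply_mllr` are assumptions, not the authors' implementation.

```python
import numpy as np

d = 3  # feature dimension (illustrative)

def apply_mllr(mean, W):
    """Adapt a Gaussian mean with an MLLR transform W of shape (d, d+1):
    adapted_mean = W @ [1, mean] (affine transform as a single matrix)."""
    xi = np.concatenate(([1.0], mean))  # extended mean vector [1, mu]
    return W @ xi

# One shared transform per regression class: all Gaussians assigned to a
# class are adapted with the same W (hypothetical toy values: a small
# bias of 0.1 and a mild scaling of 1.05 on each dimension).
W_class = np.hstack([np.full((d, 1), 0.1), 1.05 * np.eye(d)])

mu = np.array([1.0, -0.5, 2.0])
print(apply_mllr(mu, W_class))  # → [1.15, -0.425, 2.2]
```

With more adaptation data, the regression tree is refined so that deeper (more specific) classes get their own transforms; the paper's contribution is to smooth those per-class estimates across tree levels with a MAR process instead of treating each class independently.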
Document type: Conference paper
ISCA Workshop on Adaptation methods for speech recognition, Aug 2001, Sophia-Antipolis, France, 4 p, 2001

https://hal.inria.fr/inria-00101091
Contributor: Publications Loria
Submitted on: Tuesday, 26 September 2006 - 14:56:27
Last modified: Friday, 9 February 2018 - 13:20:01

Identifiers

  • HAL Id : inria-00101091, version 1

Citation

Christophe Cerisara, Khalid Daoudi. Modeling dependency between regression classes in MLLR using multiscale autoregressive models. ISCA Workshop on Adaptation methods for speech recognition, Aug 2001, Sophia-Antipolis, France, 4 p, 2001. 〈inria-00101091〉
