Journal Article: EURASIP Journal on Audio, Speech, and Music Processing, 2013

An Efficient Solution to Sparse Linear Prediction Analysis of Speech

Vahid Khanagha
Khalid Daoudi

Abstract

We propose an efficient closed-form solution to the problem of sparse linear prediction analysis of the speech signal. Our method is based on the minimization of a weighted l2-norm of the prediction error. The weighting function is constructed so that less emphasis is given to the error around the points where the largest prediction errors are expected to occur (the glottal closure instants); the resulting cost function thus approaches the ideal l0-norm cost function for sparse residual recovery. We show that minimizing this mathematically tractable objective function (by solving the normal equations of a linear least-squares problem) yields a higher level of residual sparsity than the l1-norm minimization approach, which relies on computationally demanding convex optimization methods. Indeed, the computational complexity of the proposed method is roughly the same as that of classic minimum-variance linear prediction analysis. Moreover, to demonstrate a potential application of such a sparse representation, we use the resulting linear prediction coefficients inside a multi-pulse coder and show that it achieves better coding quality than the classical multi-pulse excitation coder, which uses the traditional minimum-variance synthesizer.
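The core computation can be sketched as follows: with a diagonal weighting matrix W = diag(w), a weighted l2-norm criterion leads to the weighted normal equations (X^T W X) a = X^T W y, a single p-by-p linear system in the prediction coefficients. The NumPy sketch below is a minimal illustration under stated assumptions; the weight construction (down-weighting a few samples around assumed glottal closure instants), the random test signal, and all names and parameters are illustrative choices, not the exact weighting function of the paper.

import numpy as np

def weighted_lp(x, order, w):
    # Weighted-l2 linear prediction (covariance-method style):
    # minimize sum_n w[n] * (x[n] - sum_k a[k] * x[n-1-k])^2
    # by solving the weighted normal equations (X^T W X) a = X^T W y.
    N = len(x)
    X = np.column_stack([x[order - 1 - k : N - 1 - k] for k in range(order)])
    y = x[order:N]
    wn = w[order:N]
    XtW = X.T * wn                        # X^T W with W = diag(wn)
    return np.linalg.solve(XtW @ X, XtW @ y)

# Illustrative use: down-weight samples around assumed GCI positions
# (hypothetical indices) so that the large errors expected there count
# less in the fit.
rng = np.random.default_rng(0)
x = rng.standard_normal(400)              # stand-in for a speech frame
w = np.ones_like(x)
for gci in (100, 180, 260, 340):          # hypothetical GCI locations
    w[max(0, gci - 3):gci + 4] = 0.1
a = weighted_lp(x, order=10, w=w)
print(a)                                  # 10 prediction coefficients

Because the solution amounts to one p-by-p solve (p being the prediction order), the cost is comparable to classic minimum-variance linear prediction analysis, in line with the complexity claim in the abstract.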

Dates and versions

hal-00709168, version 1 (18-06-2012)

Identifiers

HAL Id: hal-00709168
DOI: 10.1186/1687-4722-2013-3
Cite

Vahid Khanagha, Khalid Daoudi. An Efficient Solution to Sparse Linear Prediction Analysis of Speech. EURASIP Journal on Audio, Speech, and Music Processing, 2013, 3, ⟨10.1186/1687-4722-2013-3⟩. ⟨hal-00709168⟩