An ℓ1-oracle inequality for the Lasso in finite mixture of multivariate Gaussian regression models

Emilie Devijver
SELECT - Model selection in statistical learning, Inria Saclay - Île-de-France; LMO - Laboratoire de Mathématiques d'Orsay, CNRS - Centre National de la Recherche Scientifique (UMR)
Abstract: We consider a finite mixture of multivariate Gaussian regression models for high-dimensional data, where the number of covariates and the size of the response may be much larger than the sample size. We provide an ℓ1-oracle inequality satisfied by the Lasso estimator with respect to the Kullback-Leibler loss. This result extends the ℓ1-oracle inequality established by Meynet to the multivariate case. We focus on the Lasso for its ℓ1-regularization properties rather than as a variable selection procedure, as was done in Städler et al.
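To fix ideas, an ℓ1-oracle inequality of the kind described in the abstract typically takes the following schematic form. This is an illustrative template under standard assumptions, not the paper's exact statement: the constant C, the regularization level λ, the model class S, and the remainder term R(n, λ) are placeholders.

```latex
% Schematic \ell_1-oracle inequality in Kullback--Leibler loss.
% s^*                       : true conditional density
% \hat{s}^{\mathrm{Lasso}}  : Lasso estimator at regularization level \lambda
% s_\psi \in S              : candidate mixture regression densities
% \|\psi\|_1                : \ell_1-norm of the regression parameters
\mathbb{E}\!\left[\mathrm{KL}\!\left(s^*, \hat{s}^{\mathrm{Lasso}}(\lambda)\right)\right]
\;\le\;
C \inf_{s_\psi \in S}
\left\{ \mathrm{KL}\!\left(s^*, s_\psi\right) + \lambda\,\|\psi\|_1 \right\}
\;+\; R(n, \lambda).
```

In words: up to a constant and a remainder term, the Lasso estimator performs as well as the best ℓ1-penalized trade-off between approximation error and parameter magnitude, without any assumption on the design such as restricted eigenvalue conditions.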
Document type: Journal article

Cited literature: 16 references

https://hal.inria.fr/hal-01075338
Contributor: Emilie Devijver
Submitted on: Thursday, January 14, 2016 - 1:24:10 PM
Last modification on: Thursday, February 7, 2019 - 2:52:22 PM
Long-term archiving on: Saturday, April 16, 2016 - 10:42:31 AM

Files

L1OracleInequalityArxiv.pdf
Files produced by the author(s)

Identifiers

  • HAL Id: hal-01075338, version 2

Citation

Emilie Devijver. An ℓ1-oracle inequality for the Lasso in finite mixture of multivariate Gaussian regression models. ESAIM: Probability and Statistics, EDP Sciences, 2015. ⟨hal-01075338v2⟩

Metrics

Record views: 174
Files downloads: 188