# Model-Consistent Sparse Estimation through the Bootstrap

WILLOW - Models of visual object recognition and scene understanding
CNRS (Centre National de la Recherche Scientifique, UMR 8548), Inria Paris-Rocquencourt, DI-ENS (Département d'informatique de l'École normale supérieure)
Abstract: We consider the least-squares linear regression problem with regularization by the $\ell^1$-norm, a problem usually referred to as the Lasso. In this paper, we first present a detailed asymptotic analysis of model consistency of the Lasso in low-dimensional settings. For various decays of the regularization parameter, we compute asymptotic equivalents of the probability of correct model selection. For a specific rate of decay, we show that the Lasso selects all the variables that should enter the model with probability tending to one exponentially fast, while it selects each other variable with strictly positive probability. We show that this property implies that if we run the Lasso on several bootstrapped replications of a given sample, then intersecting the supports of the Lasso bootstrap estimates leads to consistent model selection. This novel variable selection procedure, referred to as the Bolasso, is extended to high-dimensional settings by a provably consistent two-step procedure.
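The core idea of the Bolasso can be sketched in a few lines: draw bootstrap resamples, fit a Lasso on each, and keep only the variables selected in every run. Below is a minimal, self-contained illustration using a basic cyclic coordinate-descent Lasso solver; the solver, the regularization level `lam`, the number of bootstrap replications, and the synthetic data are all illustrative choices, not the paper's exact experimental setup.

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=200):
    """Lasso by cyclic coordinate descent:
    minimize (1/(2n)) ||y - X w||^2 + lam * ||w||_1."""
    n, p = X.shape
    w = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n  # per-coordinate curvature
    for _ in range(n_iter):
        for j in range(p):
            # partial residual excluding coordinate j
            r = y - X @ w + X[:, j] * w[j]
            rho = X[:, j] @ r / n
            # soft-thresholding update
            w[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
    return w

def bolasso(X, y, lam, n_boot=32, seed=0, tol=1e-8):
    """Intersect Lasso supports across bootstrap replications."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    support = None
    for _ in range(n_boot):
        idx = rng.integers(0, n, size=n)  # resample with replacement
        w = lasso_cd(X[idx], y[idx], lam)
        s = set(np.flatnonzero(np.abs(w) > tol))
        support = s if support is None else support & s
    return sorted(support)

# Synthetic example (hypothetical setup): 5 candidate variables,
# only the first two enter the true model.
rng = np.random.default_rng(1)
n, p = 200, 5
X = rng.standard_normal((n, p))
w_true = np.array([3.0, -2.0, 0.0, 0.0, 0.0])
y = X @ w_true + 0.1 * rng.standard_normal(n)

selected = bolasso(X, y, lam=0.05)
print(selected)
```

Because an irrelevant variable must survive every one of the bootstrap runs to remain in the intersection, its inclusion probability decays geometrically in the number of replications, which is the mechanism behind the consistency result stated in the abstract.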
Document type: Preprints, Working Papers

Cited literature [45 references]

https://hal.archives-ouvertes.fr/hal-00354771
Contributor: Francis Bach
Submitted on: Tuesday, January 20, 2009 - 8:56:36 PM
Last modification on: Friday, October 15, 2021 - 1:40:04 PM
Long-term archiving on: Tuesday, June 8, 2010 - 5:26:25 PM

### Files

bolasso_hal_aos.pdf
Files produced by the author(s)

### Identifiers

• HAL Id: hal-00354771, version 1
• arXiv: 0901.3202

### Citation

Francis Bach. Model-Consistent Sparse Estimation through the Bootstrap. 2009. ⟨hal-00354771⟩
