Learning the Structure for Structured Sparsity

Nino Shervashidze (1, 2), Francis Bach (1, 2)

1. SIERRA - Statistical Machine Learning and Parsimony
2. DI-ENS - Département d'informatique - ENS Paris, Inria Paris-Rocquencourt, CNRS - Centre National de la Recherche Scientifique: UMR 8548
Abstract: Structured sparsity has recently emerged in statistics, machine learning and signal processing as a promising paradigm for learning in high-dimensional settings. All existing methods for learning under the assumption of structured sparsity rely on prior knowledge of how to weight (or how to penalize) individual subsets of variables during the subset selection process, which is not available in general. Inferring group weights from data is a key open research problem in structured sparsity. In this paper, we propose a Bayesian approach to the problem of group weight learning. We model the group weights as hyperparameters of heavy-tailed priors on groups of variables and derive an approximate inference scheme to infer these hyperparameters. We empirically show that we are able to recover the model hyperparameters when the data are generated from the model, and we demonstrate the utility of learning weights in synthetic and real denoising problems.
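To make the setting concrete, the snippet below sketches the kind of weighted group penalty that structured-sparsity methods use, where each group of variables carries its own weight. This is a generic illustration of the problem setup (a weighted group-lasso-style penalty), not the paper's Bayesian method; the function name, toy groups, and weights are invented for the example.

```python
import numpy as np

def weighted_group_penalty(w, groups, weights):
    """Weighted group penalty: sum over groups g of weights[g] * ||w[groups[g]]||_2.

    Structured-sparsity methods penalize subsets (groups) of variables jointly;
    the per-group weights are normally fixed a priori. Learning these weights
    from data is the open problem the paper addresses.
    """
    return sum(lam * np.linalg.norm(w[idx]) for idx, lam in zip(groups, weights))

# Toy example: 4 variables split into two groups, with hypothetical weights.
w = np.array([1.0, 2.0, 0.0, 0.0])
groups = [np.array([0, 1]), np.array([2, 3])]
weights = [0.5, 1.0]
penalty = weighted_group_penalty(w, groups, weights)
# Only the first group is active, so the penalty is 0.5 * sqrt(1^2 + 2^2).
```

Choosing the `weights` well is exactly what requires the prior knowledge that the abstract says is unavailable in general.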
Document type :
Journal articles
Contributor: Nino Shervashidze
Submitted on: Tuesday, September 15, 2015 - 4:14:06 PM
Last modification on: Thursday, March 17, 2022 - 10:08:44 AM
Long-term archiving on: Tuesday, December 29, 2015 - 7:18:29 AM

Nino Shervashidze, Francis Bach. Learning the Structure for Structured Sparsity. IEEE Transactions on Signal Processing, Institute of Electrical and Electronics Engineers, 2015, 63 (18), pp. 4894-4902. ⟨10.1109/TSP.2015.2446432⟩. ⟨hal-00986380v4⟩