Preprints, Working Papers, ... Year: 2011

Structured sparsity through convex optimization

Abstract

Sparse estimation methods are aimed at using or obtaining parsimonious representations of data or models. While naturally cast as a combinatorial optimization problem, variable or feature selection admits a convex relaxation through the regularization by the $\ell_1$-norm. In this paper, we consider situations where we are not only interested in sparsity, but where some structural prior knowledge is available as well. We show that the $\ell_1$-norm can then be extended to structured norms built on either disjoint or overlapping groups of variables, leading to a flexible framework that can deal with various structures. We present applications to supervised learning in the context of non-linear variable selection, and to unsupervised learning, for structured sparse principal component analysis, and hierarchical dictionary learning.
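To make the group-norm construction mentioned in the abstract concrete, the following is a minimal sketch (not the authors' code) of block soft-thresholding, i.e., the proximal operator of the $\ell_1/\ell_2$ norm over disjoint groups of variables; the groups, weights, and regularization level `lam` are purely illustrative.

```python
import numpy as np

def prox_group_l1l2(w, groups, lam):
    """Proximal operator of lam * sum_g ||w_g||_2 over disjoint groups
    (block soft-thresholding): each group is shrunk toward zero and set
    exactly to zero when its Euclidean norm is at most lam, which yields
    sparsity at the level of whole groups rather than single coefficients."""
    out = w.copy()
    for g in groups:
        norm_g = np.linalg.norm(w[g])
        if norm_g <= lam:
            out[g] = 0.0                          # the whole group is discarded
        else:
            out[g] = (1.0 - lam / norm_g) * w[g]  # the group is shrunk
    return out

# Illustrative example: six coefficients split into three disjoint groups.
w = np.array([0.3, -0.2, 1.5, 2.0, 0.05, -0.1])
groups = [np.array([0, 1]), np.array([2, 3]), np.array([4, 5])]
print(prox_group_l1l2(w, groups, lam=0.5))
# -> the first and third groups vanish entirely; the second is only shrunk.
```

For overlapping or hierarchical groups, this simple per-group shrinkage no longer gives the exact proximal operator; handling such structures is among the algorithmic questions the paper addresses.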
Main file: structured_sparsity_hal.pdf (691.39 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-00621245, version 1 (09-09-2011)
hal-00621245, version 2 (20-04-2012)

Identifiers

hal-00621245

Cite

Francis Bach, Rodolphe Jenatton, Julien Mairal, Guillaume Obozinski. Structured sparsity through convex optimization. 2011. ⟨hal-00621245v1⟩