Conference paper, 2008

Some recovery conditions for basis learning by L1-minimization

Karin Schnass
  • Role: Author

Abstract

Many recent works have shown that if a given signal admits a sufficiently sparse representation in a given dictionary, then this representation is recovered by several standard optimization algorithms, in particular the convex L1-minimization approach. Here we investigate the related problem of inferring the dictionary from training data, with an approach where L1-minimization is used as a criterion to select a dictionary. We restrict our analysis to basis learning and identify necessary / sufficient / necessary and sufficient conditions on ideal (not necessarily very sparse) coefficients of the training data in an ideal basis to guarantee that the ideal basis is a strict local optimum of the L1-minimization criterion among (not necessarily orthogonal) bases of normalized vectors. We illustrate these conditions on deterministic as well as toy random models in dimension two and highlight the main challenges that remain open after these preliminary theoretical results.
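The criterion described above can be sketched numerically. The following toy example (not from the paper; a minimal sketch under simplifying assumptions) evaluates the L1 cost of a candidate basis of normalized atoms on training signals in dimension two, and checks that an ideal basis beats small random perturbations of itself. For the check to hold deterministically, the sketch uses exactly 1-sparse training coefficients, which is stronger than the "not necessarily very sparse" setting the paper analyzes; with 1-sparse data and unit-norm atoms, the inequality ||c||_1 >= ||Bc||_2 guarantees the ideal basis is a global minimizer.

```python
import numpy as np

rng = np.random.default_rng(0)

def l1_cost(B, Y):
    """L1 criterion: total l1 norm of the coefficients of the
    training signals (columns of Y) in the basis B (columns = atoms)."""
    return np.abs(np.linalg.solve(B, Y)).sum()

def normalize_columns(B):
    """Rescale each atom to unit Euclidean norm."""
    return B / np.linalg.norm(B, axis=0)

# Toy 2-D model: 1-sparse training coefficients in the canonical basis.
# (Chosen so that optimality of the ideal basis is easy to verify; the
# paper's conditions cover less sparse coefficients.)
n = 200
X = np.zeros((2, n))
X[rng.integers(0, 2, n), np.arange(n)] = rng.laplace(size=n)
B_ideal = np.eye(2)
Y = B_ideal @ X  # training signals

# The ideal basis should beat nearby bases of normalized atoms.
base = l1_cost(B_ideal, Y)
for _ in range(100):
    B = normalize_columns(B_ideal + rng.normal(scale=0.1, size=(2, 2)))
    assert l1_cost(B, Y) >= base - 1e-9
print("ideal-basis L1 cost:", round(base, 3))
```

With coefficients that are only moderately sparse, the ideal basis may be a strict local (rather than global) optimum, or fail to be one at all; characterizing exactly when it is a strict local optimum is the subject of the paper's conditions.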
Main file: 2008_ISCCSP_GribonvalSchnass_Recovery_submitted.pdf (340.11 KB)
Origin: files produced by the author(s)

Dates and versions

inria-00544763, version 1 (27-01-2011)

Identifiers

Cite

Rémi Gribonval, Karin Schnass. Some recovery conditions for basis learning by L1-minimization. 3rd IEEE International Symposium on Communications, Control and Signal Processing (ISCCSP 2008), Mar 2008, St. Julians, Malta. pp.768--773, ⟨10.1109/ISCCSP.2008.4537326⟩. ⟨inria-00544763⟩
208 views, 212 downloads
