Preprint / working paper, 2022

Off-the-grid learning of sparse mixtures from a continuous dictionary

Abstract

We consider a general non-linear model in which the signal is a finite mixture of an unknown, possibly increasing, number of features drawn from a continuous dictionary parameterized by a real non-linear parameter. The signal is observed with Gaussian (possibly correlated) noise in either a continuous or a discrete setup. We propose an off-the-grid optimization method, that is, a method that does not use any discretization scheme on the parameter space, to estimate both the non-linear parameters of the features and the linear parameters of the mixture. Using recent results on the geometry of off-the-grid methods, we give a minimal separation condition on the true underlying non-linear parameters under which interpolating certificate functions can be constructed. Using tail bounds for suprema of Gaussian processes, we bound the prediction error with high probability. Assuming that the certificate functions can be constructed, our prediction error bound is, up to logarithmic factors, similar to the rates attained by the Lasso predictor in the linear regression model. We also establish convergence rates that quantify, with high probability, the quality of estimation for both the linear and the non-linear parameters.
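To make the model concrete, the following is a minimal numerical sketch, not the paper's method: it simulates a signal that is a sparse mixture of features from a continuous dictionary (here an assumed Gaussian-shaped feature phi(t; theta) = exp(-(t - theta)^2 / 2)), observed with i.i.d. Gaussian noise, and fits it "off the grid" by descending on the non-linear parameters theta directly over the continuum, alternating with a least-squares step for the linear weights. The feature shape, step sizes, and initialization are all illustrative assumptions.

```python
# Illustrative sketch only: the dictionary feature, noise level, step size,
# and initialization below are assumptions, not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)

def phi(t, theta):
    """One dictionary feature, continuously parameterized by theta."""
    return np.exp(-0.5 * (t - theta) ** 2)

# True mixture: K = 2 well-separated non-linear parameters, linear weights a.
t = np.linspace(-5.0, 5.0, 200)
theta_true = np.array([-2.0, 1.5])
a_true = np.array([1.0, -0.7])
y = sum(a * phi(t, th) for a, th in zip(a_true, theta_true))
y += 0.05 * rng.standard_normal(t.size)          # additive Gaussian noise

# Off-the-grid refinement: theta lives in a continuum, so we run gradient
# descent on theta itself instead of restricting it to a discretization grid.
theta = np.array([-1.5, 1.0])                    # rough initial guesses
for _ in range(2000):
    Phi = np.stack([phi(t, th) for th in theta], axis=1)  # design matrix
    a, *_ = np.linalg.lstsq(Phi, y, rcond=None)           # linear weights
    r = Phi @ a - y                                       # residual
    # d/dtheta phi(t; theta) = (t - theta) * phi(t; theta)
    grad = np.array([(r * a[k] * (t - theta[k]) * Phi[:, k]).sum()
                     for k in range(theta.size)])
    theta -= 5e-3 * grad                                  # gradient step

print("theta:", np.round(theta, 2), "a:", np.round(a, 2))
```

With well-separated true parameters, as the abstract's separation condition suggests, this alternating scheme recovers (theta, a) close to the truth; with poorly separated features the problem becomes ill-conditioned, which is exactly the regime the certificate-function machinery is designed to control.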

Dates and versions

hal-03707465 , version 1 (28-06-2022)

Identifiers

Cite

Cristina Butucea, Jean-François Delmas, Anne Dutfoy, Clément Hardy. Off-the-grid learning of sparse mixtures from a continuous dictionary. 2022. ⟨hal-03707465⟩
