Beyond Least-Squares: Fast Rates for Regularized Empirical Risk Minimization through Self-Concordance - Inria - Institut national de recherche en sciences et technologies du numérique
Preprint / Working Paper, Year: 2019


Abstract

We consider learning methods based on the regularization of a convex empirical risk by a squared Hilbertian norm, a setting that includes linear predictors and non-linear predictors through positive-definite kernels. In order to go beyond the generic analysis leading to convergence rates of the excess risk as $O(1/\sqrt{n})$ from $n$ observations, we assume that the individual losses are self-concordant, that is, their third-order derivatives are bounded by their second-order derivatives. This setting includes least-squares, as well as all generalized linear models such as logistic and softmax regression. For this class of losses, we provide a bias-variance decomposition and show that the assumptions commonly made in least-squares regression, such as the source and capacity conditions, can be adapted to obtain fast non-asymptotic rates of convergence by improving the bias terms, the variance terms or both.
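The self-concordance condition described above, namely that the third-order derivative of the loss is bounded by its second-order derivative, can be checked in closed form for the logistic loss $\ell(t) = \log(1 + e^{-t})$: writing $\sigma(t) = 1/(1+e^{-t})$, one has $\ell''(t) = \sigma(t)(1-\sigma(t))$ and $\ell'''(t) = \sigma(t)(1-\sigma(t))(1-2\sigma(t))$, so $|\ell'''(t)| \le \ell''(t)$. The sketch below, which is our own illustration and not code from the paper, verifies this bound numerically on a grid of points:

```python
import numpy as np

def sigmoid(t):
    """Logistic sigmoid, the derivative building block of the logistic loss."""
    return 1.0 / (1.0 + np.exp(-t))

def logistic_d2(t):
    """Second derivative of the logistic loss l(t) = log(1 + exp(-t))."""
    s = sigmoid(t)
    return s * (1.0 - s)

def logistic_d3(t):
    """Third derivative of the logistic loss."""
    s = sigmoid(t)
    return s * (1.0 - s) * (1.0 - 2.0 * s)

# Self-concordance check: |l'''(t)| <= l''(t) holds everywhere,
# since |1 - 2*sigma(t)| <= 1.
t = np.linspace(-20.0, 20.0, 10001)
assert np.all(np.abs(logistic_d3(t)) <= logistic_d2(t) + 1e-12)
```

The same factorized structure (second derivative times a factor bounded by one in absolute value) is what makes generalized linear models such as logistic and softmax regression fit the framework of the paper.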
Main file: main_arxiv.pdf (455.36 KB)
Origin: files produced by the author(s)

Dates and versions

hal-02011895 , version 1 (08-02-2019)
hal-02011895 , version 2 (13-02-2019)
hal-02011895 , version 3 (17-06-2019)

Identifiers

Cite

Ulysse Marteau-Ferey, Dmitrii M. Ostrovskii, Francis Bach, Alessandro Rudi. Beyond Least-Squares: Fast Rates for Regularized Empirical Risk Minimization through Self-Concordance. 2019. ⟨hal-02011895v3⟩
311 views
494 downloads

