Preprints, Working Papers, ...

Beyond Least-Squares: Fast Rates for Regularized Empirical Risk Minimization through Self-Concordance

Ulysse Marteau-Ferey, Dmitrii M. Ostrovskii, Francis Bach, Alessandro Rudi
SIERRA - Statistical Machine Learning and Parsimony; DI-ENS - Département d'informatique - ENS Paris, CNRS - Centre National de la Recherche Scientifique, Inria de Paris
Abstract: We consider learning methods based on the regularization of a convex empirical risk by a squared Hilbertian norm, a setting that includes linear predictors and non-linear predictors through positive-definite kernels. In order to go beyond the generic analysis leading to convergence rates of the excess risk as $O(1/\sqrt{n})$ from $n$ observations, we assume that the individual losses are self-concordant, that is, their third-order derivatives are bounded by their second-order derivatives. This setting includes least-squares, as well as all generalized linear models such as logistic and softmax regression. For this class of losses, we provide a bias-variance decomposition and show that the assumptions commonly made in least-squares regression, such as the source and capacity conditions, can be adapted to obtain fast non-asymptotic rates of convergence by improving the bias terms, the variance terms or both.
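The self-concordance assumption in the abstract (third-order derivative bounded by the second-order derivative) holds in closed form for the logistic loss $\varphi(t) = \log(1 + e^{-t})$: with $\sigma(t) = 1/(1+e^{-t})$ one has $\varphi''(t) = \sigma(t)(1-\sigma(t))$ and $\varphi'''(t) = \varphi''(t)(1-2\sigma(t))$, so $|\varphi'''| \le \varphi''$. A minimal numerical check of this inequality, written in plain NumPy (an illustration, not code from the paper):

```python
import numpy as np

def sigmoid(t):
    """Logistic sigmoid, sigma(t) = 1 / (1 + exp(-t))."""
    return 1.0 / (1.0 + np.exp(-t))

def logistic_loss_derivatives(t):
    """Second and third derivatives of the logistic loss
    phi(t) = log(1 + exp(-t))."""
    s = sigmoid(t)
    phi2 = s * (1.0 - s)             # phi''(t) = sigma(1 - sigma)
    phi3 = phi2 * (1.0 - 2.0 * s)    # phi'''(t) = phi''(t) * (1 - 2 sigma)
    return phi2, phi3

# Verify |phi'''(t)| <= phi''(t) on a grid, i.e. the self-concordance bound.
t = np.linspace(-20.0, 20.0, 10001)
phi2, phi3 = logistic_loss_derivatives(t)
assert np.all(np.abs(phi3) <= phi2 + 1e-12)
```

Since $|1 - 2\sigma(t)| \le 1$ for all $t$, the bound holds with constant $1$, which is why the logistic loss fits the (generalized) self-concordant class the abstract describes.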
Contributor: Ulysse Marteau-Ferey
Submitted on: Monday, June 17, 2019 - 12:06:49 PM
Last modification on: Wednesday, June 8, 2022 - 12:50:06 PM

  • HAL Id: hal-02011895, version 3
  • arXiv: 1902.03046



Ulysse Marteau-Ferey, Dmitrii M. Ostrovskii, Francis Bach, Alessandro Rudi. Beyond Least-Squares: Fast Rates for Regularized Empirical Risk Minimization through Self-Concordance. 2019. ⟨hal-02011895v3⟩


