# Beyond Least-Squares: Fast Rates for Regularized Empirical Risk Minimization through Self-Concordance

SIERRA - Statistical Machine Learning and Parsimony
DI-ENS - Département d'informatique de l'École normale supérieure, CNRS - Centre National de la Recherche Scientifique, Inria de Paris
### Abstract

We consider learning methods based on the regularization of a convex empirical risk by a squared Hilbertian norm, a setting that includes linear predictors and non-linear predictors through positive-definite kernels. To go beyond the generic analysis, which yields excess-risk convergence rates of $O(1/\sqrt{n})$ from $n$ observations, we assume that the individual losses are self-concordant, that is, their third-order derivatives are bounded by their second-order derivatives. This setting includes least-squares, as well as all generalized linear models such as logistic and softmax regression. For this class of losses, we provide a bias-variance decomposition and show that the assumptions commonly made in least-squares regression, such as the source and capacity conditions, can be adapted to obtain fast non-asymptotic rates of convergence by improving the bias terms, the variance terms, or both.
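To make the self-concordance assumption concrete, the following is a standard worked computation (not reproduced from the paper; the notation $\varphi$ and $\sigma$ is introduced here for illustration) showing that the logistic loss satisfies the stated bound between third and second derivatives:

```latex
% Worked example: generalized self-concordance of the logistic loss.
% The symbols \varphi and \sigma are illustrative, not taken from the paper.
% With the sigmoid \sigma(t) = (1 + e^{-t})^{-1}, the logistic loss
% \varphi(t) = \log(1 + e^{-t}) has derivatives
\[
  \varphi''(t) = \sigma(t)\,\bigl(1 - \sigma(t)\bigr),
  \qquad
  \varphi'''(t) = \sigma(t)\,\bigl(1 - \sigma(t)\bigr)\,\bigl(1 - 2\sigma(t)\bigr),
\]
% and since |1 - 2\sigma(t)| <= 1 for all t, it follows that
\[
  \lvert \varphi'''(t) \rvert \;\le\; \varphi''(t),
\]
% i.e. the third derivative is bounded by the second derivative,
% which is the self-concordance property invoked in the abstract.
```

Least-squares satisfies the same bound trivially, since the third derivative of a quadratic loss vanishes.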
Document type: Preprints, Working Papers, ...

https://hal.inria.fr/hal-02011895
Contributor: Ulysse Marteau-Ferey
Submitted on: Monday, June 17, 2019 - 12:06:49 PM
Last modification on: Tuesday, May 4, 2021 - 2:06:02 PM

### Files

• main_arxiv.pdf (files produced by the author(s))

### Identifiers

• HAL Id: hal-02011895, version 3
• arXiv: 1902.03046

### Citation

Ulysse Marteau-Ferey, Dmitrii Ostrovskii, Francis Bach, Alessandro Rudi. Beyond Least-Squares: Fast Rates for Regularized Empirical Risk Minimization through Self-Concordance. 2019. ⟨hal-02011895v3⟩
