Conference papers

Beyond Tikhonov: Faster Learning with Self-Concordant Losses via Iterative Regularization

Gaspard Beugnot 1,2, Julien Mairal 1, Alessandro Rudi 2
1 Thoth - Learning models from massive data
Inria Grenoble - Rhône-Alpes, LJK - Laboratoire Jean Kuntzmann
2 SIERRA - Statistical Machine Learning and Parsimony
DI-ENS - Department of Computer Science - ENS Paris, CNRS - Centre National de la Recherche Scientifique, Inria de Paris
Abstract : The theory of spectral filtering is a remarkable tool for understanding the statistical properties of learning with kernels. For least squares, it allows one to derive various regularization schemes that yield faster convergence rates of the excess risk than Tikhonov regularization. This is typically achieved by leveraging classical assumptions called source and capacity conditions, which characterize the difficulty of the learning task. In order to understand estimators derived from other loss functions, Marteau-Ferey et al. have extended the theory of Tikhonov regularization to generalized self-concordant (GSC) loss functions, which include, e.g., the logistic loss. In this paper, we go a step further and show that fast and optimal rates can be achieved for GSC losses by using the iterated Tikhonov regularization scheme, which is intrinsically related to the proximal point method in optimization and overcomes the limitations of classical Tikhonov regularization.
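
The iterated Tikhonov scheme mentioned in the abstract can be illustrated, in its classical least-squares form, with kernel ridge regression: each step re-solves the ridge problem using the previous iterate as the proximal anchor, which is what ties it to the proximal point method. The sketch below is a minimal illustration of that classical least-squares scheme only, not the paper's algorithm for GSC losses; the kernel matrix K, regularization parameter lam, and iteration count are illustrative assumptions.

```python
import numpy as np

def iterated_tikhonov_krr(K, y, lam, n_iter):
    """Iterated Tikhonov (proximal point) for kernel least squares.

    Each step solves (K + n*lam*I) alpha_{k+1} = y + n*lam*alpha_k,
    i.e. ridge regression regularized toward the previous iterate.
    With alpha_0 = 0, the first iteration recovers standard kernel
    ridge regression; further iterations reduce regularization bias.
    """
    n = K.shape[0]
    A = K + n * lam * np.eye(n)
    alpha = np.zeros(n)
    for _ in range(n_iter):
        alpha = np.linalg.solve(A, y + n * lam * alpha)
    return alpha
```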

https://hal.inria.fr/hal-03406072
Contributor : Gaspard Beugnot
Submitted on : Wednesday, October 27, 2021 - 4:11:06 PM
Last modification on : Wednesday, June 8, 2022 - 12:50:06 PM


Identifiers

  • HAL Id : hal-03406072, version 1
  • ARXIV : 2106.08855


Citation

Gaspard Beugnot, Julien Mairal, Alessandro Rudi. Beyond Tikhonov: Faster Learning with Self-Concordant Losses via Iterative Regularization. NeurIPS 2021 – 35th Annual Conference on Neural Information Processing Systems, Dec 2021, Virtual, France. pp.1-37. ⟨hal-03406072⟩
