Journal article — Journal of Machine Learning Research, 2021

Analyzing the discrepancy principle for kernelized spectral filter learning algorithms


Abstract

We investigate the construction of early stopping rules in the non-parametric regression problem where iterative learning algorithms are used and the optimal iteration number is unknown. More precisely, we study the discrepancy principle, as well as modifications based on smoothed residuals, for kernelized spectral filter learning algorithms including gradient descent. Our main theoretical bounds are oracle inequalities established for the empirical estimation error (fixed design), and for the prediction error (random design). From these finite-sample bounds it follows that the classical discrepancy principle is statistically adaptive for slow rates occurring in the hard learning scenario, while the smoothed discrepancy principles are adaptive over ranges of faster rates (resp. higher smoothness parameters). Our approach relies on deviation inequalities for the stopping rules in the fixed design setting, combined with change-of-norm arguments to deal with the random design setting.
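The stopping rule analyzed in the abstract can be illustrated with a minimal sketch: run kernel gradient descent on the training residuals and stop at the first iteration where the empirical residual norm falls below the noise level. The function name, the plain (unsmoothed) residual criterion, and the simple threshold `sigma` are illustrative assumptions, not the paper's exact calibrated rule.

```python
import numpy as np

def kernel_gd_discrepancy(K, y, sigma, eta=None, max_iter=1000):
    """Kernel gradient descent stopped by a discrepancy principle.

    Hypothetical sketch: stop at the first iteration t where the
    empirical residual norm ||y - f_t||_n drops below the noise
    level sigma. The paper studies such rules (and smoothed
    variants) theoretically; this is only an illustration.
    """
    n = len(y)
    Kn = K / n  # normalized kernel matrix, as in the empirical setting
    if eta is None:
        # constant step size below 1 / ||K/n|| keeps the iteration stable
        eta = 1.0 / np.linalg.norm(Kn, 2)
    f = np.zeros(n)
    for t in range(max_iter):
        resid = y - f
        if np.sqrt(np.mean(resid ** 2)) <= sigma:  # discrepancy check
            return f, t
        f = f + eta * Kn @ resid  # one gradient-descent step
    return f, max_iter
```

Stopping early this way trades a small residual bias against the variance that accumulates when iterating past the optimal (unknown) iteration number.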
Main file: 20-358.pdf (561.25 KB)
Origin: Files produced by the author(s)
License: Copyright (All rights reserved)

Dates and versions

hal-02548917 , version 1 (21-04-2020)
hal-02548917 , version 2 (07-03-2023)

Identifiers

hal-02548917

Cite

Alain Celisse, Martin Wahl. Analyzing the discrepancy principle for kernelized spectral filter learning algorithms. Journal of Machine Learning Research, 2021. ⟨hal-02548917v2⟩
174 views
192 downloads
