Preprints, Working Papers

Analyzing the discrepancy principle for kernelized spectral filter learning algorithms

Alain Celisse ¹, Martin Wahl ²
¹ MODAL - MOdel for Data Analysis and Learning
LPP - Laboratoire Paul Painlevé - UMR 8524, Université de Lille, Sciences et Technologies; Inria Lille - Nord Europe; METRICS - Evaluation des technologies de santé et des pratiques médicales - ULR 2694; Polytech Lille - École polytechnique universitaire de Lille
Abstract: We investigate the construction of early stopping rules in the non-parametric regression problem where iterative learning algorithms are used and the optimal iteration number is unknown. More precisely, we study the discrepancy principle, as well as modifications based on smoothed residuals, for kernelized spectral filter learning algorithms, including gradient descent. Our main theoretical bounds are oracle inequalities established for the empirical estimation error (fixed design) and for the prediction error (random design). From these finite-sample bounds it follows that the classical discrepancy principle is statistically adaptive for the slow rates occurring in the hard learning scenario, while the smoothed discrepancy principles are adaptive over ranges of faster rates, corresponding to higher smoothness parameters. Our approach relies on deviation inequalities for the stopping rules in the fixed design setting, combined with change-of-norm arguments to handle the random design setting.
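The classical discrepancy principle described above can be sketched as follows for kernel gradient descent: run the iteration and stop the first time the empirical residual norm drops below a threshold calibrated to the noise level. The sketch below is a minimal illustration, not the paper's construction; the synthetic data, the Gaussian kernel with bandwidth `gamma`, and the threshold `tau = 1.1 * sigma` are all hypothetical choices, and the noise level `sigma` is assumed known.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic regression data y_i = f(x_i) + noise (hypothetical example;
# sigma is assumed known here, as the threshold is calibrated to it).
n, sigma = 200, 0.3
x = np.sort(rng.uniform(0.0, 1.0, n))
f_true = np.sin(2 * np.pi * x)
y = f_true + sigma * rng.normal(size=n)

# Gaussian kernel matrix; dividing by n keeps the gradient step stable,
# since the eigenvalues of K / n then lie in (0, 1).
gamma = 20.0
K = np.exp(-gamma * (x[:, None] - x[None, :]) ** 2)
Kn = K / n

# Gradient descent on the least-squares objective in function values:
# F_{t+1} = F_t + (K / n) (y - F_t), a spectral filter algorithm whose
# filter function is 1 - (1 - lambda)^t.
F = np.zeros(n)
tau = 1.1 * sigma  # discrepancy threshold ~ noise level (hypothetical constant)
for t in range(1, 1001):
    F = F + Kn @ (y - F)
    residual = np.sqrt(np.mean((y - F) ** 2))  # empirical residual norm
    if residual <= tau:  # discrepancy principle: stop at first crossing
        break

print(f"stopped at t = {t}, empirical residual = {residual:.3f}")
```

Running longer would keep shrinking the residual toward zero (overfitting the noise); the stopping rule halts once the residual is of the order of the noise level, trading residual fit against variance without knowing the optimal iteration number in advance.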

Cited literature: 58 references
Contributor: Alain Celisse
Submitted on: Tuesday, April 21, 2020 - 9:25:41 AM
Last modification on: Thursday, March 24, 2022 - 3:12:58 AM




  • HAL Id: hal-02548917, version 1



Alain Celisse, Martin Wahl. Analyzing the discrepancy principle for kernelized spectral filter learning algorithms. 2020. ⟨hal-02548917⟩


