Conference paper, Year: 2023

Generalization Guarantees via Algorithm-dependent Rademacher Complexity

Abstract

Algorithm- and data-dependent generalization bounds are required to explain the generalization behavior of modern machine learning algorithms. In this context, there exist information-theoretic generalization bounds that involve (various forms of) mutual information, as well as bounds based on hypothesis-set stability. We propose a conceptually related, but technically distinct, complexity measure to control generalization error: the empirical Rademacher complexity of an algorithm- and data-dependent hypothesis class. Combining standard properties of Rademacher complexity with the convenient structure of this class, we are able to (i) obtain novel bounds based on the finite fractal dimension, which (a) extend previous fractal-dimension-type bounds from continuous to finite hypothesis classes and (b) avoid a mutual information term that was required in prior work; (ii) greatly simplify the proof of a recent dimension-independent generalization bound for stochastic gradient descent; and (iii) easily recover results for VC classes and compression schemes, similar to approaches based on conditional mutual information.
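For context, the central quantity named in the abstract is the standard empirical Rademacher complexity, here stated in its usual form (the specific algorithm- and data-dependent hypothesis class is defined in the paper itself): for a fixed sample S = (z_1, ..., z_n) and a class F of real-valued functions,

\[
\widehat{\mathfrak{R}}_S(\mathcal{F}) \;=\; \mathbb{E}_{\sigma}\!\left[\, \sup_{f \in \mathcal{F}} \frac{1}{n} \sum_{i=1}^{n} \sigma_i f(z_i) \right],
\]

where \(\sigma_1, \dots, \sigma_n\) are i.i.d. Rademacher variables taking values \(\pm 1\) with equal probability. For a finite class with outputs in \([-1, 1]\), Massart's lemma gives \(\widehat{\mathfrak{R}}_S(\mathcal{F}) \le \sqrt{2 \log |\mathcal{F}| / n}\), which is the kind of finite-class control the abstract's point (i)(a) refers to.

The following Python sketch illustrates this quantity for a finite class via Monte Carlo over the Rademacher signs; all names are illustrative, and it is not code from the paper.

```python
import numpy as np

def empirical_rademacher(F_values, n_draws=10_000, seed=0):
    """Monte Carlo estimate of the empirical Rademacher complexity
    of a finite hypothesis class on a fixed sample S.

    F_values: (k, n) array; row j holds f_j(z_1), ..., f_j(z_n),
    the outputs of hypothesis j on the n sample points.
    """
    rng = np.random.default_rng(seed)
    k, n = F_values.shape
    # sigma: (n_draws, n) i.i.d. signs in {-1, +1}
    sigma = rng.choice([-1.0, 1.0], size=(n_draws, n))
    # correlations[d, j] = (1/n) * sum_i sigma[d, i] * f_j(z_i)
    correlations = sigma @ F_values.T / n
    # sup over the finite class, then average over sign draws
    return correlations.max(axis=1).mean()

# Example: 8 hypotheses with outputs in [-1, 1] on 50 sample points
rng = np.random.default_rng(1)
F = rng.uniform(-1.0, 1.0, size=(8, 50))
est = empirical_rademacher(F)
print(est, "<=", np.sqrt(2 * np.log(8) / 50))  # Massart's lemma bound
```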

Dates and versions

hal-04478945, version 1 (27-02-2024)

Licence

Attribution (CC BY)

Identifiers

Cite

Sarah Sachs, Tim van Erven, Liam Hodgkinson, Rajiv Khanna, Umut Şimşekli. Generalization Guarantees via Algorithm-dependent Rademacher Complexity. COLT 2023 - 36th Annual Conference on Learning Theory, 2023, Bangalore (Virtual event), India. ⟨hal-04478945⟩