Conference paper, 2023

Generalization Bounds using Data-Dependent Fractal Dimensions

Abstract

Providing generalization guarantees for modern neural networks has been a crucial task in statistical learning. Recently, several studies have attempted to analyze the generalization error in such settings by using tools from fractal geometry. While these works have successfully introduced new mathematical tools to apprehend generalization, they heavily rely on a Lipschitz continuity assumption, which in general does not hold for neural networks and might make the bounds vacuous. In this work, we address this issue and prove fractal geometry-based generalization bounds without requiring any Lipschitz assumption. To achieve this goal, we build upon a classical covering argument in learning theory and introduce a data-dependent fractal dimension. Despite introducing a significant amount of technical complications, this new notion lets us control the generalization error (over either fixed or random hypothesis spaces) along with certain mutual information (MI) terms. To provide a clearer interpretation of the newly introduced MI terms, as a next step, we introduce a notion of 'geometric stability' and link our bounds to the prior art. Finally, we make a rigorous connection between the proposed data-dependent dimension and topological data analysis tools, which then enables us to compute the dimension in a numerically efficient way. We support our theory with experiments conducted in various settings.
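
The last point of the abstract, computing the dimension efficiently via topological data analysis, can be illustrated with a persistent-homology-style dimension estimator based on minimal spanning trees. The sketch below is an assumption-based illustration of such an estimator for a generic point cloud (for instance, an optimization trajectory), not the paper's exact procedure; the function names (mst_edge_sum, ph_dim_estimate), the choice of alpha, and the subsample schedule are hypothetical.

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.sparse.csgraph import minimum_spanning_tree

def mst_edge_sum(points, alpha=1.0):
    """Sum of alpha-powered edge lengths of the Euclidean minimal spanning tree."""
    dists = squareform(pdist(points))      # dense pairwise distance matrix
    mst = minimum_spanning_tree(dists)     # sparse matrix holding the MST edge weights
    return float(np.power(mst.data, alpha).sum())

def ph_dim_estimate(points, alpha=1.0, sample_sizes=None, n_repeats=5, seed=0):
    """Estimate an intrinsic dimension of a point cloud (PH0 / MST-style estimator).

    Fits log E_alpha(n) ~ beta * log n over random subsamples of size n and
    returns alpha / (1 - beta).  All defaults here are illustrative choices.
    """
    rng = np.random.default_rng(seed)
    n_total = len(points)
    if sample_sizes is None:
        sample_sizes = np.linspace(n_total // 10, n_total, 10, dtype=int)
    log_n, log_e = [], []
    for n in sample_sizes:
        vals = []
        for _ in range(n_repeats):
            idx = rng.choice(n_total, size=n, replace=False)
            vals.append(mst_edge_sum(points[idx], alpha))
        log_n.append(np.log(n))
        log_e.append(np.log(np.mean(vals)))
    beta = np.polyfit(log_n, log_e, deg=1)[0]   # slope of the log-log fit
    return alpha / (1.0 - beta)

if __name__ == "__main__":
    # Toy check: a 2-D Gaussian cloud linearly embedded in R^10 has intrinsic dimension 2.
    rng = np.random.default_rng(1)
    pts = rng.normal(size=(2000, 2)) @ rng.normal(size=(2, 10))
    print(ph_dim_estimate(pts))   # expected to be close to 2
```

The slope beta of the log-log curve of total MST edge weight versus subsample size yields the dimension estimate alpha / (1 - beta); on the toy example above the estimate should come out near 2.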
Main file: data_dependent_fractals.pdf (1.13 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-04438550, version 1 (05-02-2024)

License

Attribution

Identifiers

HAL Id: hal-04438550

Cite

Benjamin Dupuis, George Deligiannidis, Umut Şimşekli. Generalization Bounds using Data-Dependent Fractal Dimensions. International Conference on Machine Learning (ICML 2023), Jul 2023, Honolulu, United States. ⟨hal-04438550⟩
