Conference papers

PAC-Bayesian Bounds based on the Rényi Divergence

Luc Bégin 1 Pascal Germain 2 François Laviolette 3 Jean-Francis Roy 3
2 SIERRA - Statistical Machine Learning and Parsimony
DI-ENS - Département d'informatique - ENS Paris, CNRS - Centre National de la Recherche Scientifique, Inria de Paris
Abstract: We propose a simplified proof process for PAC-Bayesian generalization bounds that divides the proof into four successive inequalities, easing the "customization" of PAC-Bayesian theorems. We also propose a family of PAC-Bayesian bounds based on the Rényi divergence between the prior and posterior distributions, whereas most PAC-Bayesian bounds are based on the Kullback-Leibler divergence. Finally, we present an empirical evaluation of the tightness of each inequality of the simplified proof, both for the classical PAC-Bayesian bounds and for those based on the Rényi divergence.
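For intuition only (this sketch is not from the paper): the Rényi divergence of order α between two discrete distributions P and Q is D_α(P‖Q) = (1/(α−1)) · log Σᵢ pᵢ^α qᵢ^(1−α), and the Kullback-Leibler divergence is recovered in the limit α → 1. A minimal illustration, with made-up distributions `p` and `q`:

```python
import math

def renyi_divergence(p, q, alpha):
    # D_alpha(P || Q) = 1/(alpha - 1) * log( sum_i p_i^alpha * q_i^(1 - alpha) )
    # for discrete distributions p, q and alpha != 1.
    s = sum(pi**alpha * qi**(1.0 - alpha) for pi, qi in zip(p, q))
    return math.log(s) / (alpha - 1.0)

def kl_divergence(p, q):
    # Kullback-Leibler divergence KL(P || Q), the alpha -> 1 limit of D_alpha.
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

# Two arbitrary discrete distributions, chosen purely for illustration.
p = [0.6, 0.3, 0.1]
q = [0.5, 0.25, 0.25]

# For alpha close to 1, the Rényi divergence nearly coincides with KL;
# it is non-decreasing in alpha.
print(kl_divergence(p, q))
print(renyi_divergence(p, q, 1.001))
print(renyi_divergence(p, q, 2.0))
```

The Rényi divergence is non-decreasing in α, so bounds stated for a larger order are generally looser in the divergence term, which is why comparing the tightness of the resulting PAC-Bayesian bounds (as the paper does empirically) is of interest.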


Contributor: Pascal Germain
Submitted on : Thursday, October 20, 2016 - 3:07:50 PM
Last modification on : Thursday, March 17, 2022 - 10:08:52 AM




  • HAL Id : hal-01384783, version 1



Luc Bégin, Pascal Germain, François Laviolette, Jean-Francis Roy. PAC-Bayesian Bounds based on the Rényi Divergence. International Conference on Artificial Intelligence and Statistics (AISTATS 2016), May 2016, Cadiz, Spain. ⟨hal-01384783⟩


