# PAC-Bayes Un-Expected Bernstein Inequality

Affiliations: MODAL (MOdel for Data Analysis and Learning); LPP (Laboratoire Paul Painlevé, UMR 8524, Université de Lille, Sciences et Technologies); Inria Lille - Nord Europe; METRICS (Évaluation des technologies de santé et des pratiques médicales, ULR 2694); Polytech Lille - École polytechnique universitaire de Lille
Abstract: We present a new PAC-Bayesian generalization bound. Standard bounds contain a $\sqrt{L_n \cdot \mathrm{KL}/n}$ complexity term which dominates unless $L_n$, the empirical error of the learning algorithm's randomized predictions, vanishes. We manage to replace $L_n$ by a term which vanishes in many more situations, essentially whenever the employed learning algorithm is sufficiently stable on the dataset at hand. Our new bound consistently beats state-of-the-art bounds both on a toy example and on UCI datasets (with large enough $n$). Theoretically, unlike existing bounds, our new bound can be expected to converge to $0$ faster whenever a Bernstein/Tsybakov condition holds, thus connecting PAC-Bayesian generalization and *excess risk* bounds; for the latter it has long been known that faster convergence can be obtained under Bernstein conditions. Our main technical tool is a new concentration inequality which is like Bernstein's but with $X^2$ taken outside its expectation.
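For intuition, the complexity term mentioned in the abstract can be sketched as follows. This is a schematic shape only, ignoring constants and lower-order terms, with hypothetical notation: $L(h)$ for the population risk of a predictor $h$, posterior $\rho$, prior $\pi$; the paper's exact statement differs.

```latex
% Standard PAC-Bayesian bounds take, schematically, the form
\mathbb{E}_{h \sim \rho}\!\left[L(h)\right]
  \;\le\; L_n
  \;+\; O\!\left(
      \sqrt{\frac{L_n \cdot \mathrm{KL}(\rho \,\|\, \pi)}{n}}
      \;+\; \frac{\mathrm{KL}(\rho \,\|\, \pi)}{n}
  \right)
```

The square-root term dominates unless the empirical error $L_n$ vanishes; the paper's contribution is to replace $L_n$ inside the square root by a quantity that vanishes whenever the learning algorithm is sufficiently stable on the data, even when $L_n$ itself does not vanish.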
Document type: Conference papers

Cited literature [41 references]

https://hal.inria.fr/hal-02401295
Contributor: Benjamin Guedj
Submitted on: Monday, December 9, 2019 - 9:20:48 PM

### File

1905.13367.pdf

### Identifiers

• HAL Id: hal-02401295, version 1
• arXiv: 1905.13367

### Citation

Zakaria Mhammedi, Peter Grünwald, Benjamin Guedj. PAC-Bayes Un-Expected Bernstein Inequality. NeurIPS 2019, Dec 2019, Vancouver, Canada. ⟨hal-02401295⟩
