
Differentiable PAC-Bayes Objectives with Partially Aggregated Neural Networks

Felix Biggs, Benjamin Guedj
Affiliations: MODAL (MOdel for Data Analysis and Learning, Inria Lille - Nord Europe); LPP (Laboratoire Paul Painlevé, UMR 8524, Université de Lille); METRICS (Évaluation des technologies de santé et des pratiques médicales, ULR 2694); Polytech Lille (École polytechnique universitaire de Lille)
Abstract: We make three related contributions motivated by the challenge of training stochastic neural networks, particularly in a PAC-Bayesian setting: (1) we show how averaging over an ensemble of stochastic neural networks enables a new class of *partially-aggregated* estimators; (2) we show that these lead to provably lower-variance gradient estimates for non-differentiable signed-output networks; (3) we reformulate a PAC-Bayesian bound for these networks to derive a directly optimisable, differentiable objective and a generalisation guarantee, without using a surrogate loss or loosening the bound. This bound is twice as tight as that of Letarte et al. (2019) on a similar network type. We show empirically that these innovations make training easier and lead to competitive guarantees.
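To illustrate why aggregation helps with non-differentiable signed-output units (this is a hedged sketch, not the authors' code; the names `mu`, `sigma`, and `x` and the single-unit setting are illustrative assumptions): for one unit f(x) = sign(w·x) with Gaussian weights w ~ N(mu, sigma²I), a single sampled sign is piecewise-constant in mu and gives no usable gradient, but the expectation over w has the closed form E[sign(w·x)] = erf(mu·x / (√2·sigma·‖x‖)), which is smooth and differentiable in mu.

```python
import numpy as np
from math import erf, sqrt

# Illustrative single signed-output unit: f(x) = sign(w.x), w ~ N(mu, sigma^2 I).
# The aggregated (expected) output is differentiable in mu, unlike one sample.
mu = np.array([0.5, 0.2, 0.1])     # mean weights (assumed values, for illustration)
sigma = 1.0                        # weight noise scale
x = np.array([1.0, 2.0, -1.0])     # input

# Closed-form aggregated output: w.x ~ N(mu.x, sigma^2 ||x||^2),
# so E[sign(w.x)] = 2*P(w.x > 0) - 1 = erf(mu.x / (sqrt(2)*sigma*||x||)).
agg = erf(mu @ x / (sqrt(2) * sigma * np.linalg.norm(x)))

# Monte Carlo check: average the non-differentiable sign over sampled weights.
rng = np.random.default_rng(0)
samples = rng.normal(mu, sigma, size=(200_000, 3))
mc = np.mean(np.sign(samples @ x))

print(agg, mc)  # the two estimates agree up to Monte Carlo error
```

The paper's partially-aggregated estimators exploit exactly this kind of trade: layers (or units) whose expectation is tractable are aggregated in closed form, while the rest are sampled, which provably reduces gradient variance relative to pure sampling.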
Document type: Preprints, Working Papers

Cited literature: 17 references
Contributor: Benjamin Guedj
Submitted on: Tuesday, June 23, 2020 - 4:00:59 PM
Last modification on: Friday, February 5, 2021 - 3:29:24 AM


Files produced by the author(s)


  • HAL Id: hal-02879216, version 1
  • arXiv: 2006.12228



Felix Biggs, Benjamin Guedj. Differentiable PAC-Bayes Objectives with Partially Aggregated Neural Networks. 2020. ⟨hal-02879216⟩


