
Negative Dependence Tightens Variational Bounds

Pierre-Alexandre Mattei (1, 2), Jes Frellsen (3)
1 MAASAI - Modèles et algorithmes pour l'intelligence artificielle (CRISAM - Inria Sophia Antipolis - Méditerranée; Laboratoire I3S - SPARKS - Scalable and Pervasive softwARe and Knowledge Systems; UNS - Université Nice Sophia Antipolis; JAD - Laboratoire Jean Alexandre Dieudonné)
Abstract: Importance weighted variational inference (IWVI) is a promising strategy for learning latent variable models. IWVI uses new variational bounds, known as Monte Carlo objectives (MCOs), obtained by replacing intractable integrals with Monte Carlo estimates, usually obtained via importance sampling. Burda et al. (2016) showed that increasing the number of importance samples provably tightens the gap between the bound and the likelihood. We show that, in a somewhat similar fashion, increasing the negative dependence of the importance weights monotonically increases the bound. To this end, we use the supermodular order as a measure of dependence. Our simple result provides theoretical support for several approaches that have leveraged negative dependence to perform efficient variational inference in deep generative models.
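For concreteness, here is a minimal sketch of the bound at stake, in notation of our own (not taken from this record): with a model p_θ(x, z), a proposal q_φ(z | x), and K importance samples z_1, ..., z_K drawn from q_φ(· | x), the Monte Carlo objective and the monotonicity result of Burda et al. (2016) read

\[
\mathcal{L}_K \,=\, \mathbb{E}_{z_1, \dots, z_K \sim q_\phi(\cdot \mid x)}\left[ \log \frac{1}{K} \sum_{k=1}^{K} \frac{p_\theta(x, z_k)}{q_\phi(z_k \mid x)} \right],
\qquad
\mathcal{L}_K \,\le\, \mathcal{L}_{K+1} \,\le\, \log p_\theta(x),
\]

where the ordering holds for i.i.d. samples. The paper's claim, as stated in the abstract, is that for a fixed K, making the importance weights w_k = p_θ(x, z_k) / q_φ(z_k | x) more negatively dependent (in the supermodular order) can only increase \mathcal{L}_K.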

https://hal.inria.fr/hal-03044115
Contributor: Pierre-Alexandre Mattei
Submitted on: Monday, December 7, 2020 - 4:41:54 PM
Last modified on: Wednesday, December 9, 2020 - 10:22:15 AM

File

mono_icml_workshop (1).pdf
Files produced by the author(s)

Identifiers

  • HAL Id: hal-03044115, version 1


Citation

Pierre-Alexandre Mattei, Jes Frellsen. Negative Dependence Tightens Variational Bounds. ICML 2020 - 2nd Workshop on Negative Dependence and Submodularity for ML, Jul 2020, Vienna / Online, Austria. ⟨hal-03044115⟩

