Conference papers

Negative Dependence Tightens Variational Bounds

Pierre-Alexandre Mattei (1, 2), Jes Frellsen (3)
1 MAASAI - Modèles et algorithmes pour l'intelligence artificielle
CRISAM - Inria Sophia Antipolis - Méditerranée, UNS - Université Nice Sophia Antipolis (... - 2019), JAD - Laboratoire Jean Alexandre Dieudonné, Laboratoire I3S - SPARKS - Scalable and Pervasive softwARe and Knowledge Systems
Abstract: Importance weighted variational inference (IWVI) is a promising strategy for learning latent variable models. IWVI uses new variational bounds, known as Monte Carlo objectives (MCOs), obtained by replacing intractable integrals with Monte Carlo estimates, usually obtained simply via importance sampling. Burda et al. (2016) showed that increasing the number of importance samples provably tightens the gap between the bound and the likelihood. We show that, in a somewhat similar fashion, increasing the negative dependence of the importance weights monotonically increases the bound. To this end, we use the supermodular order as a measure of dependence. Our simple result provides theoretical support for several approaches that leverage negative dependence to perform efficient variational inference of deep generative models.
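For concreteness, the K-sample Monte Carlo objective at the heart of IWVI is the log of an average of importance weights, log((1/K) * sum_k w_k), where w_k = p(x, z_k) / q(z_k | x) and z_1, ..., z_K are drawn from q(· | x). Below is a minimal NumPy sketch of this estimator, computed stably via log-sum-exp; the function name and the toy log-weights are illustrative assumptions, not code from the paper.

    import numpy as np

    def monte_carlo_objective(log_w):
        """K-sample Monte Carlo objective (IWAE-style bound).

        log_w: array of K log importance weights,
        log w_k = log p(x, z_k) - log q(z_k | x), with z_k ~ q(. | x).
        Returns log((1/K) * sum_k w_k), a stochastic lower bound on log p(x).
        """
        log_w = np.asarray(log_w, dtype=float)
        m = log_w.max()  # log-sum-exp shift for numerical stability
        return m + np.log(np.mean(np.exp(log_w - m)))

    # Toy usage (illustrative log-weights, not from the paper):
    rng = np.random.default_rng(0)
    log_w = rng.normal(loc=-1.0, scale=0.5, size=64)
    print(monte_carlo_objective(log_w))

In this framing, Burda et al.'s result says the expectation of this objective is nondecreasing in K, while the paper's contribution is that it also increases as the weights w_1, ..., w_K become more negatively dependent in the supermodular order.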

https://hal.inria.fr/hal-03044115
Contributor: Pierre-Alexandre Mattei
Submitted on: Monday, December 7, 2020 - 4:41:54 PM
Last modification on: Friday, January 21, 2022 - 3:09:54 AM
Long-term archiving on: Monday, March 8, 2021 - 7:25:38 PM

File

mono_icml_workshop (1).pdf
Files produced by the author(s)

Identifiers

  • HAL Id: hal-03044115, version 1

Citation

Pierre-Alexandre Mattei, Jes Frellsen. Negative Dependence Tightens Variational Bounds. ICML 2020 - 2nd Workshop on Negative Dependence and Submodularity for ML, Jul 2020, Vienna / Online, Austria. ⟨hal-03044115⟩
