Journal article in SIAM/ASA Journal on Uncertainty Quantification, 2021

Variance Reduction for Dependent Sequences with Applications to Stochastic Gradient MCMC

Abstract

In this paper we propose a novel and practical variance reduction approach for additive functionals of dependent sequences. Our approach combines the use of control variates with the minimization of an empirical variance estimate. We analyze finite sample properties of the proposed method and derive finite-time bounds on the convergence of the excess asymptotic variance to zero. We apply our methodology to stochastic gradient Markov chain Monte Carlo (SGMCMC) methods for Bayesian inference on large data sets and combine it with existing variance reduction methods for SGMCMC. We present empirical results on a number of benchmark examples showing that our variance reduction method achieves a significant improvement over state-of-the-art methods at the expense of a moderate increase in computational overhead.
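To illustrate the general idea of combining control variates with empirical variance minimization, the sketch below is a minimal, hypothetical example (it is not the authors' implementation, and names such as `control_variate_estimate` and `grad_log_pi` are assumptions). It builds first-order Stein control variates from the score function, which have zero mean under the target distribution, and selects their coefficients by least squares, which minimizes the plain empirical variance of the adjusted functional. The paper instead minimizes a variance estimate suited to dependent draws, so this sketch only conveys the flavor of the approach.

```python
# Minimal sketch: control variates with coefficients chosen by
# minimizing an empirical variance estimate (hypothetical names).
import numpy as np

def control_variate_estimate(samples, f_vals, grad_log_pi):
    """Plain and control-variate-adjusted averages of f over a chain.

    samples     : (n, d) array of (possibly dependent) MCMC draws
    f_vals      : (n,)   array with f(x_t) for each draw
    grad_log_pi : (n, d) array with grad log pi(x_t) for each draw
    """
    # First-order Stein control variates: psi(x) = grad log pi(x),
    # which integrate to zero under the target pi.
    psi = grad_log_pi
    psi_c = psi - psi.mean(axis=0)       # center empirically
    f_c = f_vals - f_vals.mean()
    # Coefficients theta minimizing the empirical variance of
    # f(x) - theta^T psi(x): an ordinary least-squares problem.
    theta, *_ = np.linalg.lstsq(psi_c, f_c, rcond=None)
    adjusted = f_vals - psi @ theta
    variance_ratio = adjusted.var() / max(f_vals.var(), 1e-12)
    return f_vals.mean(), adjusted.mean(), variance_ratio
```

For SGLD-type chains the score evaluations `grad_log_pi` are already computed (possibly with stochastic gradients) when generating the samples, so the extra cost of the adjustment is essentially the least-squares solve, consistent with the "moderate increase in computational overhead" noted in the abstract.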

Dates and versions

hal-03529417, version 1 (17-01-2022)

Identifiers

Cite

Denis Belomestny, Leonid Iosipoi, Eric Moulines, Alexey Naumov, Sergey Samsonov. Variance Reduction for Dependent Sequences with Applications to Stochastic Gradient MCMC. SIAM/ASA Journal on Uncertainty Quantification, 2021, 9, pp. 507-535. ⟨10.1137/19m1301199⟩. ⟨hal-03529417⟩