Inria preprint, Year: 2024

Bayesian Likelihood Free Inference using Mixtures of Experts

Abstract

We extend Bayesian Synthetic Likelihood (BSL) methods to non-Gaussian approximations of the likelihood function. In this setting, we introduce Mixtures of Experts (MoEs), a class of neural network models, as surrogate likelihoods with desirable approximation-theoretic properties. Moreover, MoEs can be estimated using Expectation-Maximization (EM) algorithm-based approaches, such as the Gaussian Locally Linear Mapping model estimators that we implement. Further, we provide theoretical evidence for the ability of our procedure to estimate and approximate a wide range of likelihood functions. Through simulations, we demonstrate the superiority of our approach over existing BSL variants in terms of both posterior approximation accuracy and computational efficiency.
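To make the core idea concrete, here is a minimal sketch of a non-Gaussian surrogate likelihood for likelihood-free inference: summaries are simulated at a fixed parameter value and a Gaussian mixture fitted by EM serves as the surrogate density at the observed summary. This is a simplification for illustration only: it uses a plain one-dimensional Gaussian mixture rather than the paper's MoE/Gaussian Locally Linear Mapping conditional model, and the simulator, variable names, and settings (`simulate_summaries`, `theta`, `K`) are invented for the example.

```python
import numpy as np

def fit_gmm_em(x, K=2, n_iter=100, seed=0):
    """Fit a 1-D K-component Gaussian mixture by EM; returns (weights, means, variances)."""
    rng = np.random.default_rng(seed)
    n = len(x)
    # Initialise: means at random data points, shared variance, uniform weights.
    mu = rng.choice(x, size=K, replace=False)
    var = np.full(K, x.var() + 1e-6)
    w = np.full(K, 1.0 / K)
    for _ in range(n_iter):
        # E-step: responsibilities r[i, k] of component k for observation i.
        dens = w * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
        r = dens / dens.sum(axis=1, keepdims=True)
        # M-step: closed-form updates of weights, means, variances.
        nk = r.sum(axis=0)
        w = nk / n
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk + 1e-9
    return w, mu, var

def gmm_density(y, w, mu, var):
    """Mixture density evaluated at a scalar point y."""
    return float(np.sum(w * np.exp(-0.5 * (y - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)))

def simulate_summaries(theta, m=500, seed=1):
    """Toy simulator (hypothetical): bimodal summaries whose mode separation depends on theta."""
    rng = np.random.default_rng(seed)
    signs = rng.choice([-1.0, 1.0], size=m)
    return signs * theta + rng.normal(0.0, 0.5, size=m)

# Surrogate likelihood at the observed summary for one parameter value:
# a single Gaussian (standard BSL) would badly misfit this bimodal distribution.
y_obs = 2.0
theta = 2.0
x = simulate_summaries(theta)
w, mu, var = fit_gmm_em(x, K=2)
surrogate_lik = gmm_density(y_obs, w, mu, var)
```

In a full BSL-style procedure this surrogate would be refitted at each proposed parameter value inside a sampler; the paper's MoE approach instead learns a single conditional model over parameters and summaries.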
Main file: soumis-main0.pdf (1.31 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-04436187, version 1 (03-02-2024)

Licence

Attribution

Identifiers

  • HAL Id: hal-04436187, version 1

Cite

Florence Forbes, Hien Duy Nguyen, Trungtin Nguyen. Bayesian Likelihood Free Inference using Mixtures of Experts. 2024. ⟨hal-04436187⟩