Conference papers

Meta-Learning with Shared Amortized Variational Inference

Abstract: We propose a novel amortized variational inference scheme for an empirical Bayes meta-learning model, where model parameters are treated as latent variables. We learn the prior distribution over model parameters conditioned on limited training data using a variational autoencoder approach. Our framework proposes sharing the same amortized inference network between the conditional prior and variational posterior distributions over the model parameters. While the posterior leverages both the labeled support and query data, the conditional prior is based only on the labeled support data. We show that in earlier work, relying on Monte Carlo approximation, the conditional prior collapses to a Dirac delta function. In contrast, our variational approach prevents this collapse and preserves uncertainty over the model parameters. We evaluate our approach on the miniImageNet, CIFAR-FS and FC100 datasets, and present results demonstrating its advantages over previous work.
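The central idea in the abstract — one shared amortized inference network that parameterizes both the conditional prior (from support data only) and the variational posterior (from support plus query data), with a KL term between them — can be illustrated with a toy sketch. This is a minimal hypothetical illustration, not the paper's implementation: the `encode` network, the pooling scheme, and all dimensions are assumptions made up for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def encode(data, W):
    """Hypothetical shared amortized inference network: mean-pool the
    examples, then a linear map to the mean and log-variance of a
    diagonal Gaussian over the model parameters theta."""
    pooled = data.mean(axis=0)
    out = W @ pooled
    d = out.shape[0] // 2
    return out[:d], out[d:]  # mean, log-variance

def kl_diag_gauss(mu_q, logvar_q, mu_p, logvar_p):
    """KL( N(mu_q, var_q) || N(mu_p, var_p) ) for diagonal Gaussians."""
    var_q, var_p = np.exp(logvar_q), np.exp(logvar_p)
    return 0.5 * np.sum(
        logvar_p - logvar_q + (var_q + (mu_q - mu_p) ** 2) / var_p - 1.0
    )

# Toy few-shot task: support and query feature vectors (dimensions invented).
support = rng.normal(size=(5, 4))   # 5 labeled support examples
query = rng.normal(size=(15, 4))    # 15 query examples

dim_theta = 3
W = rng.normal(scale=0.1, size=(2 * dim_theta, 4))  # shared network weights

# The SAME network W parameterizes both distributions:
mu_p, logvar_p = encode(support, W)                      # conditional prior
mu_q, logvar_q = encode(np.vstack([support, query]), W)  # variational posterior

# The KL term that keeps the posterior close to the conditional prior;
# because the prior has nonzero variance, it cannot collapse to a Dirac delta.
kl = kl_diag_gauss(mu_q, logvar_q, mu_p, logvar_p)
assert kl >= 0.0
```

In the paper's framework this KL term would enter an evidence lower bound alongside the expected log-likelihood on the query set; here it only demonstrates the weight sharing between the two distributions.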


Contributor : Thoth Team
Submitted on : Monday, August 31, 2020 - 4:15:55 AM
Last modification on : Wednesday, May 4, 2022 - 12:18:03 PM
Long-term archiving on: : Tuesday, December 1, 2020 - 12:06:01 PM



  • HAL Id : hal-02925830, version 1
  • ARXIV : 2008.12037



Ekaterina Iakovleva, Jakob Verbeek, Karteek Alahari. Meta-Learning with Shared Amortized Variational Inference. ICML 2020 - 37th International Conference on Machine Learning, Jul 2020, Vienna (Online), Austria. pp.4572-4582. ⟨hal-02925830⟩


