Journal article

Neural Empirical Bayes

Abstract: We unify kernel density estimation and empirical Bayes and address a set of problems in unsupervised machine learning with a geometric interpretation of those methods, rooted in the concentration of measure phenomenon. Kernel density estimation is viewed symbolically as X ⇀ Y, where the random variable X is smoothed to Y = X + N(0, σ²I_d), and empirical Bayes is the machinery to denoise in a least-squares sense, which we express as X ↽ Y. A learning objective is derived by combining these two, symbolically captured by X ⇌ Y. Crucially, instead of using the original nonparametric estimators, we parametrize the energy function with a neural network denoted by φ; at optimality, ∇φ ≈ −∇ log f, where f is the density of Y. The optimization problem is abstracted as interactions of high-dimensional spheres which emerge due to the concentration of isotropic Gaussians. We introduce two algorithmic frameworks based on this machinery: (i) a "walk-jump" sampling scheme that combines Langevin MCMC (walks) and empirical Bayes (jumps), and (ii) a probabilistic framework for associative memory, called NEBULA, defined à la Hopfield by the gradient flow of the learned energy to a set of attractors. We finish the paper by reporting the emergence of very rich "creative memories" as attractors of NEBULA for highly-overlapping spheres.
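The smoothing/denoising pair in the abstract can be sketched concretely. Below is a minimal NumPy toy in which X is standard Gaussian, so the density f of the smoothed variable Y is known in closed form and its score ∇ log f is analytic; in the paper the score is instead −∇φ for a learned neural network φ. The empirical Bayes "jump" is the least-squares denoiser x̂(y) = y + σ² ∇ log f(y), and the walk-jump sampler runs Langevin MCMC in Y-space before a final jump. All variable names and step sizes here are illustrative assumptions, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(0)
d, sigma, n = 2, 0.5, 100_000

# Smoothing X -> Y: X ~ N(0, I_d) is corrupted to Y = X + N(0, sigma^2 I_d).
X = rng.standard_normal((n, d))
Y = X + sigma * rng.standard_normal((n, d))

# In this Gaussian toy, Y ~ N(0, (1 + sigma^2) I_d), so the score of f
# (the density of Y) is analytic. In the paper, score(y) = -grad phi(y)
# with phi a trained neural network.
def score(y):
    return -y / (1.0 + sigma**2)

# Empirical Bayes "jump" X <- Y: least-squares denoiser
# xhat(y) = y + sigma^2 * grad log f(y).
def xhat(y):
    return y + sigma**2 * score(y)

# Walk-jump sampling: Langevin MCMC walks in Y-space, then one jump back.
def walk_jump(steps=1000, eps=0.01):
    y = rng.standard_normal(d)
    for _ in range(steps):
        y = y + eps * score(y) + np.sqrt(2 * eps) * rng.standard_normal(d)
    return xhat(y)
```

For this toy the denoiser reduces the mean squared error from σ²d for the raw noisy Y down to σ²d/(1+σ²), the Bayes-optimal value, which illustrates the least-squares sense of "denoise" in the abstract.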

Cited literature: 41 references
Contributor: Aapo Hyvärinen
Submitted on: Thursday, December 19, 2019 - 3:01:41 PM
Last modification on: Monday, March 9, 2020 - 5:28:07 PM
Long-term archiving on: Friday, March 20, 2020 - 6:43:40 PM


Files produced by the author(s)


  • HAL Id: hal-02419496, version 1



Saeed Saremi, Aapo Hyvärinen. Neural Empirical Bayes. Journal of Machine Learning Research, Microtome Publishing, 2019, 20, pp.1 - 23. ⟨hal-02419496⟩


