Conference Papers, Year: 2019

Low-Complexity Nonparametric Bayesian Online Prediction with Universal Guarantees

Frédéric Cazals (1)
Alix Lhéritier (1, 2)

Abstract

We propose a novel nonparametric online predictor for discrete labels conditioned on multivariate continuous features. The predictor is based on a feature space discretization induced by a full-fledged k-d tree with randomly picked directions, combined with a recursive Bayesian distribution, which allows it to automatically learn the most relevant feature scales characterizing the conditional distribution. We prove its pointwise universality, i.e., it achieves a normalized log loss performance asymptotically as good as the true conditional entropy of the labels given the features. The time complexity to process the n-th sample point is O(log n) in probability with respect to the distribution generating the data points, whereas other exact nonparametric methods require processing all past observations. Experiments on challenging datasets show the computational and statistical efficiency of our algorithm in comparison to standard and state-of-the-art methods.
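To make the construction concrete, here is a minimal illustrative sketch of the two ingredients the abstract names: an incrementally grown k-d tree whose split directions are picked at random, and a prediction rule that aggregates per-node label estimates along the root-to-leaf path. The `KDTreePredictor` class, the uniform averaging of Krichevsky–Trofimov estimates, and the midpoint split rule are simplifying assumptions for illustration, not the paper's exact recursive Bayesian mixture.

```python
import random
from collections import Counter

class Node:
    """One cell of the feature-space discretization."""
    def __init__(self):
        self.counts = Counter()   # label counts of samples routed through this node
        self.dim = None           # split dimension (randomly picked); None for a leaf
        self.threshold = None     # split value along self.dim
        self.left = self.right = None
        self.point = None         # the single sample stored in a leaf

class KDTreePredictor:
    """Illustrative sketch: online k-d tree with random split directions.
    Predicts by uniformly averaging Krichevsky-Trofimov (KT) estimates
    along the root-to-leaf path (a simplification of the paper's
    recursive Bayesian distribution)."""
    def __init__(self, num_labels, seed=0):
        self.root = Node()
        self.k = num_labels
        self.rng = random.Random(seed)

    def _kt(self, counts, label, n):
        # KT (add-1/2) estimate of P(label) from n observed labels
        return (counts[label] + 0.5) / (n + 0.5 * self.k)

    def predict(self, x, label):
        # Average the KT estimates of every node on the path containing x.
        node, probs = self.root, []
        while node is not None:
            n = sum(node.counts.values())
            probs.append(self._kt(node.counts, label, n))
            if node.dim is None:
                break
            node = node.left if x[node.dim] <= node.threshold else node.right
        return sum(probs) / len(probs)

    def update(self, x, y):
        # Route (x, y) to a leaf, splitting leaves on a random direction.
        node = self.root
        while True:
            node.counts[y] += 1
            if node.dim is not None:
                node = node.left if x[node.dim] <= node.threshold else node.right
                continue
            if node.point is None:        # empty leaf: store the sample
                node.point = (x, y)
                return
            px, py = node.point
            diff = [i for i in range(len(x)) if x[i] != px[i]]
            if not diff:                  # duplicate feature vector: keep as leaf
                return
            d = self.rng.choice(diff)     # randomly picked split direction
            node.dim, node.threshold = d, (x[d] + px[d]) / 2.0
            node.left, node.right = Node(), Node()
            node.point = None
            # push the previously stored sample one level down
            child = node.left if px[d] <= node.threshold else node.right
            child.counts[py] += 1
            child.point = (px, py)
            # loop continues and routes the new sample into the other child
```

Since each update only walks one root-to-leaf path, the per-sample cost is proportional to the tree depth, which is the source of the O(log n)-in-probability complexity claimed in the abstract; the KT estimates guarantee the per-label probabilities on any path average to a valid distribution.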
No file

Dates and versions

hal-02425602 , version 1 (30-12-2019)

Identifiers

  • HAL Id : hal-02425602 , version 1

Cite

Frédéric Cazals, Alix Lhéritier. Low-Complexity Nonparametric Bayesian Online Prediction with Universal Guarantees. NeurIPS 2019 - Thirty-third Conference on Neural Information Processing Systems, Dec 2019, Vancouver, Canada. ⟨hal-02425602⟩