Things Bayes can't do

Daniil Ryabko
SEQUEL (Sequential Learning), Inria Lille - Nord Europe, and CRIStAL - Centre de Recherche en Informatique, Signal et Automatique de Lille (UMR 9189)

Abstract: The problem of forecasting conditional probabilities of the next event given the past is considered in a general probabilistic setting. Given an arbitrary (large, uncountable) set C of predictors, we would like to construct a single predictor that performs asymptotically as well as the best predictor in C, on any data. Here we show that there are sets C for which such predictors exist, but none of them is a Bayesian predictor with a prior concentrated on C. In other words, there is a predictor with sublinear regret, but every Bayesian predictor must have linear regret. This negative finding is in sharp contrast with previous results, which establish the opposite in the case where one of the predictors in C achieves asymptotically vanishing error: in that case, if there is a predictor whose error vanishes asymptotically for every measure in C, then there is also a Bayesian predictor with this property, whose prior is concentrated on (a countable subset of) C.
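To make the quantities in the abstract concrete, the following LaTeX sketch fixes one standard way the objects are defined in this line of work: log-loss regret and mixture (Bayesian) predictors. The alphabet $\mathcal{X}$, the symbol $R_n$, and the mixture $\rho_W$ are notational assumptions made here for illustration; the paper's exact definitions should be taken from the text itself.

```latex
% Notation sketch (assumptions: finite alphabet, log-loss regret;
% the paper's precise definitions may differ in detail).
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}

Predictors are probability measures over one-way infinite sequences
$x_1 x_2 \dots$ with $x_i \in \mathcal{X}$; a predictor $\rho$ forecasts
the conditional probability $\rho(x_{n+1} \mid x_1 \dots x_n)$.

The (log-loss) regret of $\rho$ with respect to a set $\mathcal{C}$ of
predictors on a sequence $x_1 \dots x_n$ is
\[
  R_n(\rho, \mathcal{C}) \;=\;
  \sup_{\mu \in \mathcal{C}}
  \bigl( \log \mu(x_1 \dots x_n) - \log \rho(x_1 \dots x_n) \bigr),
\]
and $\rho$ has sublinear regret if $R_n(\rho, \mathcal{C}) = o(n)$ on every sequence.

A Bayesian predictor with prior $W$ concentrated on $\mathcal{C}$ is the mixture
\[
  \rho_W(A) \;=\; \int_{\mathcal{C}} \mu(A) \, dW(\mu),
  \qquad W(\mathcal{C}) = 1 .
\]
The result exhibits a set $\mathcal{C}$ for which some predictor has sublinear
regret, while every such mixture $\rho_W$ has regret linear in $n$.

\end{document}
```
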
https://hal.inria.fr/hal-01380063

Citation

Daniil Ryabko. Things Bayes can't do. Proceedings of the 27th International Conference on Algorithmic Learning Theory (ALT'16), Oct 2016, Bari, Italy. pp.253-260, ⟨10.1007/978-3-319-46379-7_17⟩. ⟨hal-01380063⟩
