Universality of Bayesian mixture predictors
Daniil Ryabko
SEQUEL - Sequential Learning - Inria Lille - Nord Europe - Inria - Institut National de Recherche en Informatique et en Automatique - CRIStAL - Centre de Recherche en Informatique, Signal et Automatique de Lille - UMR 9189 - Centrale Lille - Université de Lille - CNRS - Centre National de la Recherche Scientifique
Conference paper, HAL CCSD, 2017
https://hal.inria.fr/hal-01627332
Subjects: [INFO.INFO-LG] Computer Science [cs]/Machine Learning [cs.LG]; [STAT.TH] Statistics [stat]/Statistics Theory [stat.TH]

Abstract: The problem considered is sequential probability forecasting for finite-valued time series. The data is generated by an unknown probability distribution over the space of all one-way infinite sequences. This measure is known to belong to a given set C, but the latter is completely arbitrary (it may be uncountably infinite, with no structure given). Performance is measured by asymptotic average log loss. In this work it is shown that the minimax asymptotic performance is always attainable, and that it is attained by a convex combination of countably many measures from the set C (a Bayesian mixture). This was previously known only for the case where the best achievable asymptotic error is 0. This also contrasts with previous results showing that in the non-realizable case all Bayesian mixtures may be suboptimal, even though a predictor achieving the optimal performance exists.
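A possible formalization of the minimax statement described in the abstract, in standard notation (the symbols below are assumed for illustration, not taken from the paper's text): for a predictor $\rho$, a probability measure on the space of one-way infinite sequences over a finite alphabet $X$, define the asymptotic average log loss with respect to $\mu \in \mathcal{C}$ as

\[
\bar L(\mu, \rho) := \limsup_{n\to\infty} \frac{1}{n}\, \mathbf{E}_\mu\!\left[-\log \rho(x_1,\dots,x_n)\right].
\]

The claimed result is then that the minimax value

\[
V_{\mathcal{C}} := \inf_{\rho}\, \sup_{\mu \in \mathcal{C}} \bar L(\mu, \rho)
\]

is attained by a Bayesian mixture $\nu = \sum_{k\in\mathbb{N}} w_k \mu_k$ with $\mu_k \in \mathcal{C}$, $w_k > 0$, $\sum_k w_k = 1$, that is, $\sup_{\mu\in\mathcal{C}} \bar L(\mu, \nu) = V_{\mathcal{C}}$.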
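To make the central object of the abstract concrete, here is a minimal sketch (not the paper's construction) of a Bayesian mixture predictor: a finite truncation of a countable mixture of i.i.d. Bernoulli measures over binary sequences, with its sequential predictive probabilities and average log loss. All function names and parameter choices below are hypothetical, for illustration only.

```python
import math

def mixture_predictor(weights, thetas):
    """Bayesian mixture of i.i.d. Bernoulli(theta) measures with prior
    weights; returns a sequential predictor for binary sequences.
    (Illustrative sketch only -- the paper allows an arbitrary set C.)"""
    def predict(history):
        # Posterior weight of each component given the observed history.
        ones = sum(history)
        n = len(history)
        post = [w * (t ** ones) * ((1 - t) ** (n - ones))
                for w, t in zip(weights, thetas)]
        z = sum(post)
        # Predictive probability that the next symbol is 1.
        return sum(p * t for p, t in zip(post, thetas)) / z
    return predict

def avg_log_loss(predict, seq):
    """Average log loss (in nats) of the predictor on a binary sequence."""
    loss = 0.0
    for i, x in enumerate(seq):
        p1 = predict(seq[:i])
        loss += -math.log(p1 if x == 1 else 1 - p1)
    return loss / len(seq)
```

On a long run of 1s the posterior concentrates on the component with the largest theta, and on a balanced sequence the average log loss approaches that of the best component (here Bernoulli(0.5), i.e. log 2 nats) plus a vanishing regret term, which is the kind of behavior the realizable-case results describe.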