
Time-series information and learning

Daniil Ryabko 1
1 SEQUEL - Sequential Learning
LIFL - Laboratoire d'Informatique Fondamentale de Lille, Inria Lille - Nord Europe, LAGIS - Laboratoire d'Automatique, Génie Informatique et Signal
Abstract : Given a time series $X_1,\dots,X_n,\dots$ taking values in a large (high-dimensional) space $\mathcal{X}$, we would like to find a function $f$ from $\mathcal{X}$ to a small (low-dimensional or finite) space $\mathcal{Y}$ such that the time series $f(X_1),\dots,f(X_n),\dots$ retains all the information about the time-series dependence in the original sequence, or as much of it as possible. This goal is formalized in this work, and it is shown that the target function $f$ can be found as the one that maximizes a certain quantity that can be expressed in terms of entropies of the series $(f(X_i))_{i\in\mathbb{N}}$. This quantity can be estimated empirically, and does not involve estimating the distribution of the original time series $(X_i)_{i\in\mathbb{N}}$.
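To make the idea concrete, here is a minimal sketch of how such an entropy-based score could be estimated empirically for a candidate function $f$ with finite range. The specific objective used below, the plug-in marginal entropy minus a $k$-th order conditional-entropy estimate of the quantized series, is an assumption chosen for illustration; the paper defines its own quantity, and the function names here (`plugin_entropy`, `dependence_score`) are hypothetical.

```python
import math
from collections import Counter

def plugin_entropy(symbols):
    """Plug-in (empirical) Shannon entropy, in bits, of a list of symbols."""
    counts = Counter(symbols)
    n = len(symbols)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def dependence_score(series, k=2):
    """Score a finite-valued series by h_0 - h_k, where h_k is a k-th order
    conditional (block) entropy estimate.  A larger score suggests more
    temporal dependence retained in the quantized series.  This particular
    objective is an illustrative assumption, not the paper's exact quantity.
    """
    h0 = plugin_entropy(list(series))
    # Conditional entropy estimate: H(X_{k+1} | X_1..X_k)
    #   ~ H(blocks of length k+1) - H(blocks of length k)
    blocks_k = [tuple(series[i:i + k]) for i in range(len(series) - k + 1)]
    blocks_k1 = [tuple(series[i:i + k + 1]) for i in range(len(series) - k)]
    hk = plugin_entropy(blocks_k1) - plugin_entropy(blocks_k)
    return h0 - hk

# Compare two candidate quantizers f on the same raw series: the one whose
# image series scores higher preserves more of the time-series dependence.
raw = [0.1, 0.9, 0.2, 0.8, 0.15, 0.85, 0.05, 0.95] * 50
f_good = lambda x: int(x > 0.5)   # keeps the alternating low/high pattern
f_bad = lambda x: 0               # constant: destroys all structure
s_good = dependence_score([f_good(x) for x in raw])
s_bad = dependence_score([f_bad(x) for x in raw])
```

Note that the score is computed entirely from the image series $(f(X_i))$, which mirrors the abstract's point that no estimate of the distribution of the high-dimensional $(X_i)$ is needed.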
Document type :
Conference papers
Contributor : Daniil Ryabko
Submitted on : Thursday, May 16, 2013 - 1:55:47 PM
Last modification on : Thursday, January 20, 2022 - 4:16:20 PM


  • HAL Id : hal-00823233, version 1


Daniil Ryabko. Time-series information and learning. ISIT - International Symposium on Information Theory, 2013, Istanbul, Turkey. pp.1392-1395. ⟨hal-00823233⟩


