# Time-series information and learning

Affiliations: SEQUEL (Sequential Learning), Inria Lille - Nord Europe; LIFL (Laboratoire d'Informatique Fondamentale de Lille); LAGIS (Laboratoire d'Automatique, Génie Informatique et Signal)
Abstract: Given a time series $X_1,\dots,X_n,\dots$ taking values in a large (high-dimensional) space $\mathcal{X}$, we would like to find a function $f$ from $\mathcal{X}$ to a small (low-dimensional or finite) space $\mathcal{Y}$ such that the time series $f(X_1),\dots,f(X_n),\dots$ retains all the information about the time-series dependence in the original sequence, or as much of it as possible. This goal is formalized in this work, and it is shown that the target function $f$ can be found as the one that maximizes a certain quantity expressible in terms of entropies of the series $(f(X_i))_{i\in\mathbb{N}}$. This quantity can be estimated empirically, and does not involve estimating the distribution of the original time series $(X_i)_{i\in\mathbb{N}}$.
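The abstract notes that the objective is expressible via entropies of $(f(X_i))$ and estimable empirically. The paper's exact criterion is not reproduced here; as a hedged illustration only, the sketch below uses standard plug-in (empirical frequency) block-entropy estimates of a discrete label sequence, and measures temporal dependence as the gap between the single-symbol entropy and an order-$k$ conditional entropy. The function names (`block_entropy`, `dependence_gap`) and the particular gap statistic are assumptions for illustration, not the paper's definitions.

```python
from collections import Counter
from math import log2
import random

def block_entropy(seq, k):
    """Plug-in estimate (in bits) of the entropy of k-blocks of seq.

    Counts overlapping length-k windows and applies the empirical
    Shannon entropy formula to their frequencies.
    """
    blocks = [tuple(seq[i:i + k]) for i in range(len(seq) - k + 1)]
    n = len(blocks)
    return -sum((c / n) * log2(c / n) for c in Counter(blocks).values())

def dependence_gap(seq, k=3):
    """Single-symbol entropy minus the order-k conditional entropy
    H(Y_k | Y_1..Y_{k-1}) = H_k - H_{k-1} (all plug-in estimates).

    This is an illustrative dependence measure, not the paper's exact
    objective: it is large when the labels carry strong temporal
    structure and near zero for i.i.d. or constant labels.
    """
    h1 = block_entropy(seq, 1)
    h_cond = block_entropy(seq, k) - block_entropy(seq, k - 1)
    return h1 - h_cond

# Compare candidate label sequences f(X_1), ..., f(X_n):
periodic = [0, 1] * 500                       # strong temporal dependence
constant = [0] * 1000                          # no information at all
rng = random.Random(0)
iid = [rng.randint(0, 1) for _ in range(1000)]  # high entropy, no dependence
```

Note that everything is computed from the low-dimensional labels alone, which mirrors the abstract's point: no estimate of the distribution of the original high-dimensional series $(X_i)$ is needed.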
Document type: Conference paper

https://hal.inria.fr/hal-00823233
Contributor: Daniil Ryabko
Submitted on: Thursday, May 16, 2013 - 1:55:47 PM
Last modification on: Thursday, February 21, 2019 - 10:52:49 AM

### Identifiers

• HAL Id: hal-00823233, version 1

### Citation

Daniil Ryabko. Time-series information and learning. ISIT - International Symposium on Information Theory, 2013, Istanbul, Turkey. pp.1392-1395. ⟨hal-00823233⟩
