Large Deviation Principle for a Markov Chain with a Countable State Space
Abstract
Let E be a denumerable state space and let X be a homogeneous Markov chain on E with kernel P. Then the chain X satisfies a weak Sanov theorem, i.e. a weak large deviation principle holds for the law of the pair empirical measure. In our opinion this improves on the existing literature, since an LDP in the Markov case generally requires either that E be finite or that strong uniformity conditions hold, conditions which important classes of chains, e.g. bounded-jump networks, do not satisfy. Moreover, this weak LDP holds for any Markov chain with a discrete state space, including non-ergodic chains.
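For the reader's orientation, the following is a minimal sketch of the objects mentioned above, in standard notation; the symbols L_n and I and the exact form of the bounds are chosen here for illustration and are not quoted from the body of the paper. The pair empirical measure of the first n steps of X is

\[
L_n \;=\; \frac{1}{n}\sum_{k=0}^{n-1}\delta_{(X_k,\,X_{k+1})},
\]

a random element of \(\mathcal{M}_1(E\times E)\), and a weak large deviation principle for the laws of \((L_n)\) with rate function \(I\colon \mathcal{M}_1(E\times E)\to[0,\infty]\) asserts

\begin{align*}
\liminf_{n\to\infty}\frac{1}{n}\log \mathbb{P}\bigl(L_n\in G\bigr) &\;\ge\; -\inf_{\nu\in G} I(\nu)
  && \text{for every open } G\subset\mathcal{M}_1(E\times E),\\
\limsup_{n\to\infty}\frac{1}{n}\log \mathbb{P}\bigl(L_n\in K\bigr) &\;\le\; -\inf_{\nu\in K} I(\nu)
  && \text{for every compact } K\subset\mathcal{M}_1(E\times E);
\end{align*}

the weak form thus requires the upper bound only on compact sets, whereas a full LDP would require it on all closed sets.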