Master thesis

States Reduction on Markov Processes

Alexis Papic 1
1 TOSCA - TO Simulate and CAlibrate stochastic models, CRISAM - Inria Sophia Antipolis - Méditerranée, IECL - Institut Élie Cartan de Lorraine (UMR 7502)
Abstract : A Markov process is a stochastic process satisfying the Markov property: conditionally on the present, the future is independent of the past. We first consider a continuous-time Markov process indexed by the real line with values in a finite set, whose law is defined by exponentially distributed jumps together with a transition measure selecting the location of the process at each jump time, or equivalently by the generator matrix. We also study piecewise deterministic Markov processes, a richer class built from two sub-processes, one on a continuous state space and one on a discrete state space, which together form a Markov process combining deterministic motion punctuated by random jumps. When there are several weakly irreducible classes and the generator matrix can be rewritten as a two-scale generator depending on a small parameter ε, we present a method that approximates the process by a two-scale process, namely a slow process on a reduced state space and a fast process inside each new class, and we prove that the approximation error on the laws is of order ε. We present simulation examples and an application to the sodium channel of the Hodgkin-Huxley model, where the voltage gates of type h and m are separated into two different time scales.
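
As a complement to the abstract, the minimal sketch below (not taken from the thesis; the function name simulate_ctmc, the 3-state generator matrices and the value of ε are illustrative assumptions) simulates a finite-state continuous-time Markov chain directly from its generator matrix, using exponentially distributed holding times and the transition measure that selects the next state, and assembles a two-scale generator of the form Q_fast/ε + Q_slow of the kind on which such a state-space reduction would operate.

import numpy as np

def simulate_ctmc(Q, x0, T, rng=None):
    # Simulate a finite-state continuous-time Markov chain with generator Q
    # up to time T, starting from state x0.  Returns the jump times and states.
    rng = np.random.default_rng() if rng is None else rng
    times, states = [0.0], [x0]
    t, x = 0.0, x0
    while True:
        rate = -Q[x, x]                    # total jump rate out of the current state
        if rate <= 0:                      # absorbing state: no further jumps
            break
        t += rng.exponential(1.0 / rate)   # exponentially distributed holding time
        if t > T:
            break
        p = Q[x].copy()
        p[x] = 0.0
        p = p / rate                       # transition measure at the jump time
        x = int(rng.choice(len(p), p=p))
        times.append(t)
        states.append(x)
    return np.array(times), np.array(states)

# Illustrative two-scale generator Q_eps = Q_fast / eps + Q_slow
# (3-state example; the values are chosen only for demonstration).
eps = 0.01
Q_fast = np.array([[-1.0,  1.0,  0.0],
                   [ 1.0, -1.0,  0.0],
                   [ 0.0,  0.0,  0.0]])
Q_slow = np.array([[-0.5,  0.0,  0.5],
                   [ 0.0, -0.2,  0.2],
                   [ 0.3,  0.1, -0.4]])
jump_times, path = simulate_ctmc(Q_fast / eps + Q_slow, x0=0, T=10.0)

For small ε the fast part mixes the states inside each weakly irreducible class between the rare slow jumps, which is the regime in which a slow process on the reduced state space approximates the law of the original chain.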
Document type : Master thesis

Cited literature : 10 references

https://hal.inria.fr/hal-01369707
Contributor : Etienne Tanré
Submitted on : Wednesday, September 21, 2016 - 2:15:11 PM
Last modification on : Tuesday, May 18, 2021 - 2:32:02 PM
Long-term archiving on : Thursday, December 22, 2016 - 1:04:54 PM

File

Alexis_Papic_stage_2016.pdf
Files produced by the author(s)

Identifiers

  • HAL Id : hal-01369707, version 1

Citation

Alexis Papic. States Reduction on Markov Processes. Probability [math.PR]. 2016. ⟨hal-01369707⟩
