Journal article, ESAIM: Control, Optimisation and Calculus of Variations, Year: 2024

Absorbing Markov Decision Processes

Abstract

In this paper, we study discrete-time absorbing Markov Decision Processes (MDPs) with measurable state space and Borel action space, for a given initial distribution. For such models, there may exist solutions to the characteristic equation that are not occupation measures. We provide several necessary and sufficient conditions guaranteeing that every solution to the characteristic equation is an occupation measure. Under the so-called continuity-compactness conditions, we first show that a measure is an occupation measure if and only if it satisfies the characteristic equation together with an additional absolute continuity condition. Second, we show that the set of occupation measures is compact in the weak-strong topology if and only if the model is uniformly absorbing. Several examples illustrate our results.
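The abstract refers to, but does not display, the characteristic equation. As a hedged sketch only, using standard notation that is assumed here rather than taken from the paper (state space $\mathbf{X}$, action space $\mathbf{A}$, set of non-absorbing states $\mathbf{X}'$, initial distribution $\nu$, transition kernel $Q$, policy $\pi$), the occupation measure of a policy is typically defined as
\[
\mu_{\pi}(\Gamma) \;=\; \mathbb{E}^{\pi}_{\nu}\Big[ \sum_{t=0}^{\infty} \mathbf{1}_{\Gamma}(x_t, a_t) \Big], \qquad \Gamma \in \mathcal{B}(\mathbf{X}' \times \mathbf{A}),
\]
and the characteristic equation asks of a measure $\mu$ on $\mathbf{X}' \times \mathbf{A}$ that its state marginal satisfy
\[
\mu(B \times \mathbf{A}) \;=\; \nu(B) + \int_{\mathbf{X}' \times \mathbf{A}} Q(B \mid x, a)\, \mu(\mathrm{d}x, \mathrm{d}a), \qquad B \in \mathcal{B}(\mathbf{X}').
\]
Every occupation measure solves this linear equation, but, as the abstract notes, the converse may fail; the paper's conditions characterize exactly when it does not.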
No file deposited

Dates and versions

hal-04377071, version 1 (07-01-2024)

License

Attribution

Identifiers

  • HAL Id: hal-04377071, version 1

Cite

François Dufour, Tomás Prieto-Rumeau. Absorbing Markov Decision Processes. ESAIM: Control, Optimisation and Calculus of Variations, In press. ⟨hal-04377071⟩