Hamilton-Jacobi-Bellman Equation for a Time-Optimal Control Problem in the Space of Probability Measures

Abstract: In this paper we formulate a time-optimal control problem in the space of probability measures endowed with the Wasserstein metric, as a natural generalization of the corresponding classical problem in $${\mathbb {R}}^d$$ in which the controlled dynamics is given by a differential inclusion. The main motivation is to model situations in which we have only probabilistic knowledge of the initial state. In particular, we first prove a Dynamic Programming Principle and then derive a Hamilton-Jacobi-Bellman equation in the space of probability measures, which is solved, in a suitable viscosity sense, by a generalization of the minimum time function.
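For orientation, the classical object being generalized can be sketched as follows; the notation ($$F$$, $$S$$, $$T$$, $$\tilde S$$, $$\tilde T$$) is illustrative and need not match the authors' exact definitions. Given a differential inclusion $$\dot x(t) \in F(x(t))$$ for a.e. $$t \ge 0$$ with a target set $$S \subset {\mathbb {R}}^d$$, the classical minimum time function is

$$T(x_0) := \inf\{\, \tau \ge 0 \;:\; \exists\, x(\cdot) \text{ admissible},\ x(0)=x_0,\ x(\tau)\in S \,\}.$$

A natural measure-valued counterpart, written here only as an assumed illustration, replaces the initial point $$x_0$$ by a probability measure $$\mu_0$$ and asks for the minimal time at which an admissible curve of measures $$(\mu_t)_{t\in[0,\tau]}$$ in the Wasserstein space reaches a generalized target set of measures $$\tilde S$$:

$$\tilde T(\mu_0) := \inf\{\, \tau \ge 0 \;:\; \exists\, (\mu_t)_{t\in[0,\tau]} \text{ admissible},\ \mu_{t=0}=\mu_0,\ \mu_\tau\in\tilde S \,\}.$$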
Document type: Conference papers

Cited literature: 10 references

https://hal.inria.fr/hal-01626919
Contributor: Hal Ifip
Submitted on: Tuesday, October 31, 2017, 2:41:34 PM
Last modification on: Tuesday, October 31, 2017, 2:44:52 PM
Long-term archiving on: Thursday, February 1, 2018, 1:44:01 PM

File: 447583_1_En_18_Chapter.pdf (files produced by the author(s))

Licence: Distributed under a Creative Commons Attribution 4.0 International License

Citation

Giulia Cavagnari, Antonio Marigonda, Giandomenico Orlandi. Hamilton-Jacobi-Bellman Equation for a Time-Optimal Control Problem in the Space of Probability Measures. 27th IFIP Conference on System Modeling and Optimization (CSMO), Jun 2015, Sophia Antipolis, France. pp. 200-208, ⟨10.1007/978-3-319-55795-3_18⟩. ⟨hal-01626919⟩
