Risk filtering and risk-averse control of Markovian systems subject to model uncertainty

Bibliographic Details
Published in: Mathematical Methods of Operations Research (Heidelberg, Germany), 2023-10, Vol. 98 (2), p. 231-268
Main authors: Bielecki, Tomasz R., Cialenco, Igor, Ruszczyński, Andrzej
Format: Article
Language: English
Online access: Full text
Description

Abstract: We consider a Markov decision process subject to model uncertainty in a Bayesian framework, where we assume that the state process is observed but its law is unknown to the observer. In addition, while the state process and the controls are observed at time t, the actual cost that may depend on the unknown parameter is not known at time t. The controller optimizes the total cost by using a family of special risk measures, called risk filters, that are appropriately defined to take into account the model uncertainty of the controlled system. These key features lead to non-standard and non-trivial risk-averse control problems, for which we derive the Bellman principle of optimality. We illustrate the general theory on two practical examples: clinical trials and optimal investment.
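
For orientation only, a generic risk-averse dynamic programming recursion under Bayesian model uncertainty can be sketched as follows; this is an illustrative formulation with assumed notation (value function V_t, stage cost c_t, belief \mu_t, conditional risk measure \rho_t), not the paper's precise definition of risk filters:

\[
V_T(x,\mu) = 0, \qquad
V_t(x,\mu) = \min_{a \in A} \, \rho_t\!\left[\, c_t(x,a,\theta) + V_{t+1}\bigl(X_{t+1}, \mu_{t+1}\bigr) \right], \quad t = T-1,\dots,0,
\]

where \mu_{t+1} denotes the Bayesian posterior update of the belief \mu about the unknown parameter \theta after observing the transition from x to X_{t+1} under action a, and \rho_t is a conditional risk measure applied jointly to the transition randomness and the parameter uncertainty.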
ISSN: 1432-2994, 1432-5217
DOI: 10.1007/s00186-023-00834-z