VERIFICATION THEOREMS FOR STOCHASTIC OPTIMAL CONTROL PROBLEMS IN HILBERT SPACES BY MEANS OF A GENERALIZED DYNKIN FORMULA

Bibliographic Details
Published in: The Annals of Applied Probability 2018-12, Vol. 28 (6), p. 3558-3599
Authors: Federico, Salvatore; Gozzi, Fausto
Format: Article
Language: English
Online Access: Full text
Description
Summary: Verification theorems are key results for successfully employing the dynamic programming approach to optimal control problems. In this paper, we introduce a new method to prove verification theorems for infinite-dimensional stochastic optimal control problems. The method applies in the case of additively controlled Ornstein–Uhlenbeck processes, when the associated Hamilton–Jacobi–Bellman (HJB) equation admits a mild solution (in the sense of [J. Differential Equations 262 (2017) 3343–3389]). The main methodological novelty of our result lies in the fact that it is not necessary to prove, as in the previous literature (see, e.g., [Comm. Partial Differential Equations 20 (1995) 775–826]), that the mild solution is a strong solution, that is, a suitable limit of classical solutions of approximating HJB equations. To achieve this goal, we prove a new type of Dynkin formula, which is the key tool for the proof of our main result.
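For orientation, the following LaTeX sketch records the standard form of an additively controlled Ornstein–Uhlenbeck state equation in a Hilbert space and the classical Dynkin formula that a result of this type extends; the notation (A, sigma, u, W, the generator) is illustrative and not taken from the article, whose contribution is a version of such a formula valid for mild solutions of the HJB equation, where the classical identity need not apply directly.

% Illustrative sketch only; A, \sigma, u, W and \mathcal{L}^u are assumed
% notation, not quoted from the paper.
\[
  dX_t = \bigl(A X_t + u_t\bigr)\,dt + \sigma\,dW_t, \qquad X_0 = x \in H,
\]
% where A generates a C_0-semigroup on the Hilbert space H, u is the control
% process and W is a cylindrical Wiener process. For v : H -> R regular enough
% that the generator \mathcal{L}^u applies classically, Dynkin's formula reads
\[
  \mathbb{E}\bigl[v(X_T)\bigr] = v(x) + \mathbb{E}\int_0^T \mathcal{L}^u v(X_s)\,ds .
\]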
ISSN: 1050-5164; 2168-8737
DOI: 10.1214/18-AAP1397