NECESSARY AND SUFFICIENT CONDITIONS FOR OPTIMAL CONTROL OF SEMILINEAR STOCHASTIC PARTIAL DIFFERENTIAL EQUATIONS

Cited: 0
Authors
Stannat, Wilhelm [1 ]
Wessels, Lukas [2 ]
Affiliations
[1] Tech Univ Berlin, Inst Math, Berlin, Germany
[2] Georgia Inst Technol, Sch Math, Atlanta, GA USA
Source
ANNALS OF APPLIED PROBABILITY | 2024, Vol. 34, No. 3
Keywords
Stochastic maximum principle; Pontryagin maximum principle; dynamic programming; verification theorem; stochastic optimal control; Hamilton-Jacobi-Bellman equation; VERIFICATION THEOREMS; MAXIMUM PRINCIPLE; HJB EQUATIONS; FRAMEWORK;
DOI
10.1214/23-AAP2038
Chinese Library Classification (CLC)
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Discipline Codes
020208; 070103; 0714
Abstract
Using a recently introduced representation of the second-order adjoint state as the solution of a function-valued backward stochastic partial differential equation (SPDE), we calculate the viscosity super- and subdifferential of the value function evaluated along an optimal trajectory for controlled semilinear SPDEs. This establishes the well-known connection between Pontryagin's maximum principle and dynamic programming within the framework of viscosity solutions. As a corollary, we derive that the correction term in the stochastic Hamiltonian arising in nonsmooth stochastic control problems is nonpositive. These results directly lead us to a stochastic verification theorem for fully nonlinear Hamilton-Jacobi-Bellman equations in the framework of viscosity solutions.
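For orientation, the LaTeX sketch below recalls, in finite-dimensional notation, the standard definition of the spatial second-order viscosity superjet appearing in the abstract, together with the schematic form that relations between the adjoint states and these jets typically take. This is not the paper's precise statement: the paper works in an infinite-dimensional (function-valued) setting, and its notation and sign conventions may differ. The symbols V, X*, u*, p, P, h, z, Z, and T below are generic placeholders introduced here for illustration.

% Schematic sketch in finite-dimensional notation; not the paper's statement.
% Convention assumed here: the first-order adjoint p solves a backward equation
% with terminal condition p(T) = \partial_x h(X^*(T)), so that p(t) plays the
% role of \partial_x V along the optimal trajectory.
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
% Spatial second-order superjet of the value function V at (t,x):
\[
  D_x^{2,+} V(t,x) \;=\; \Bigl\{ (z,Z) \;:\;
    V(t,y) \le V(t,x) + \langle z,\, y-x\rangle
    + \tfrac12 \langle Z(y-x),\, y-x\rangle + o(|y-x|^2)
    \ \text{as } y \to x \Bigr\};
\]
% the subjet D_x^{2,-}V(t,x) is defined with the reverse inequality.
% Along an optimal pair (X^*,u^*), with p(t) the first-order and P(t) the
% second-order adjoint state, results of this type typically read, for a.e. t,
% almost surely,
\[
  \{p(t)\} \times [P(t), +\infty) \;\subseteq\; D_x^{2,+} V\bigl(t, X^*(t)\bigr),
  \qquad
  D_x^{2,-} V\bigl(t, X^*(t)\bigr) \;\subseteq\; \{p(t)\} \times (-\infty, P(t)],
\]
% i.e. the first-order adjoint identifies the spatial gradient and the
% second-order adjoint bounds the admissible Hessians from one side.
\end{document}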
Pages: 3251-3287
Page count: 37