Infinite-dimensional optimization and Bayesian nonparametric learning of stochastic differential equations

Authors
Ganguly, Arnab [1 ]
Mitra, Riten [2 ]
Zhou, Jinpu [1 ]
Affiliations
[1] Louisiana State Univ, Dept Math, Baton Rouge, LA 70820 USA
[2] Univ Louisville, Dept Bioinformat & Biostat, Louisville, KY 40202 USA
Keywords
Reproducing kernel Hilbert spaces (RKHS); infinite-dimensional optimization; representer theorem; nonparametric learning; stochastic differential equations; diffusion processes; Bayesian methods; inference; likelihood; regression; shrinkage
DOI
Not available
Chinese Library Classification
TP [Automation and Computer Technology]
Discipline classification code
0812
Abstract
The paper has two major themes. The first part establishes general results for infinite-dimensional optimization problems on Hilbert spaces. These results cover the classical representer theorem and many of its variants as special cases and offer a wider scope of applications. The second part then develops a systematic approach to learning the drift function of a stochastic differential equation by integrating the results of the first part with a Bayesian hierarchical framework. Importantly, our Bayesian approach incorporates low-cost sparse learning through appropriate use of shrinkage priors while allowing proper quantification of uncertainty through posterior distributions. Several examples at the end illustrate the accuracy of our learning scheme.
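To make the setting concrete, the following is a minimal, hypothetical sketch (not the paper's algorithm, which is Bayesian with shrinkage priors) of RKHS-based drift learning: a path of dX_t = b(X_t) dt + sigma dW_t is discretized by Euler–Maruyama, and the drift is fit by kernel ridge regression on the increments, where the representer theorem guarantees the estimator has the finite-dimensional form b_hat(x) = sum_i alpha_i k(x, X_i). The kernel bandwidth `h` and ridge penalty `lam` are illustrative choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
dt, n, sigma = 0.01, 1000, 0.5

def b(x):
    # True drift (Ornstein-Uhlenbeck), used only to generate data.
    return -x

# Simulate one path with the Euler-Maruyama scheme.
X = np.empty(n + 1)
X[0] = 1.0
for i in range(n):
    X[i + 1] = X[i] + b(X[i]) * dt + sigma * np.sqrt(dt) * rng.standard_normal()

# Regression targets: crude finite-difference drift estimates Delta X / Delta t.
x, y = X[:-1], np.diff(X) / dt

def gram(a, c, h=0.5):
    # Gaussian (RBF) kernel matrix k(a_i, c_j); h is an illustrative bandwidth.
    return np.exp(-(a[:, None] - c[None, :]) ** 2 / (2 * h ** 2))

# Kernel ridge regression: solve (K + n*lam*I) alpha = y. By the representer
# theorem, the RKHS-penalized minimizer lives in span{k(., X_i)}.
lam = 0.1
K = gram(x, x)
alpha = np.linalg.solve(K + n * lam * np.eye(n), y)

def b_hat(t):
    # Drift estimate b_hat(t) = sum_i alpha_i k(t, X_i).
    return gram(np.atleast_1d(np.asarray(t, dtype=float)), x) @ alpha

print(b_hat(0.5))  # estimated drift near x = 0.5 (true value is -0.5)
```

The paper replaces this point estimate with a hierarchical Bayesian treatment, so posterior draws of the coefficients quantify uncertainty and shrinkage priors sparsify the expansion; the finite representer form above is what makes that posterior computation tractable.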
Pages: 39