Sparse Gaussian processes for solving nonlinear PDEs

Cited by: 8
Authors
Meng, Rui [1 ]
Yang, Xianjin [2 ,3 ]
Affiliations
[1] Lawrence Berkeley Natl Lab, Berkeley, CA USA
[2] Tsinghua Univ, Yau Math Sci Ctr, Haidian Dist, Beijing 100084, Peoples R China
[3] Beijing Inst Math Sci & Applicat, Beijing 101408, Peoples R China
Keywords
Partial differential equations; Sparse Gaussian process; Mean-field games; Learning framework
DOI
10.1016/j.jcp.2023.112340
Chinese Library Classification
TP39 [Computer applications]
Discipline codes
081203; 0835
Abstract
This article proposes an efficient numerical method for solving nonlinear partial differential equations (PDEs) based on sparse Gaussian processes (SGPs). Gaussian processes (GPs) have been studied extensively for solving PDEs by posing the problem as finding a function in a reproducing kernel Hilbert space (RKHS) that approximates the PDE solution. The approximate solution lies in the span of basis functions generated by evaluating derivatives of various orders of the kernel at the sample points. However, the GP approach can incur a heavy computational burden because inverting the resulting Gram matrix costs cubic time in the number of sample points. We conjecture that a solution exists in a "condensed" subspace that achieves similar approximation performance, and we propose an SGP-based method that reformulates the optimization problem in this "condensed" subspace. This significantly reduces the computational burden while retaining the desired accuracy. The paper rigorously formulates the problem and provides an error analysis together with numerical experiments demonstrating the effectiveness of the method. The experiments show that the SGP method, using fewer than half of the uniform samples as inducing points, achieves accuracy comparable to the GP method that uses all of the uniform samples, yielding a significant reduction in computational cost. Our contributions include formulating the nonlinear PDE problem as an optimization problem on a "condensed" subspace of the RKHS using SGPs, together with an existence proof and a rigorous error analysis. Furthermore, our method can be viewed as an extension of the GP method to general positive semi-definite kernels. (c) 2023 Elsevier Inc. All rights reserved.
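The core idea in the abstract, restricting the kernel representation to a small set of M inducing points and solving the PDE in that "condensed" subspace rather than over all N samples, can be illustrated with a toy sketch. The code below is a minimal, simplified stand-in, not the authors' exact SGP formulation: it solves a manufactured 1D problem -u'' + u^3 = f on (0,1) with u(0) = u(1) = 0 by least-squares kernel collocation, where the Gaussian kernel, the lengthscale ELL, the point counts N and M, and the regularization weight are all assumptions chosen for illustration.

```python
import numpy as np
from scipy.optimize import least_squares

ELL = 0.2  # kernel lengthscale (assumed for illustration)

def k(x, z):
    """Gaussian (RBF) kernel, evaluated pairwise via broadcasting."""
    return np.exp(-(x - z) ** 2 / (2 * ELL ** 2))

def d2k_dx2(x, z):
    """Second derivative of k with respect to its first argument."""
    r = (x - z) / ELL
    return (r ** 2 - 1) / ELL ** 2 * np.exp(-r ** 2 / 2)

# N collocation samples; M < N inducing points span the "condensed" basis.
N, M = 40, 12
x = np.linspace(0.0, 1.0, N)
z = np.linspace(0.0, 1.0, M)

# Manufactured problem: -u'' + u^3 = f with exact solution u(x) = sin(pi x).
f = np.pi ** 2 * np.sin(np.pi * x) + np.sin(np.pi * x) ** 3

K = k(x[:, None], z[None, :])         # N x M: basis values at the samples
K2 = d2k_dx2(x[:, None], z[None, :])  # N x M: second-derivative features

def residual(alpha):
    u = K @ alpha                      # candidate solution u(x_i)
    upp = K2 @ alpha                   # candidate u''(x_i)
    pde = -upp + u ** 3 - f            # nonlinear PDE residual at samples
    bc = np.array([u[0], u[-1]])       # Dirichlet boundary residuals
    reg = 1e-6 * alpha                 # mild Tikhonov regularization
    return np.concatenate([pde, bc, reg])

alpha = least_squares(residual, np.zeros(M)).x
print("max error:", np.abs(K @ alpha - np.sin(np.pi * x)).max())
```

The scaling point the abstract makes survives even in this simplified setting: the unknown is an M-vector of coefficients rather than an N-vector, so the dominant linear-algebra cost drops from O(N^3) toward O(N M^2) as M shrinks relative to N.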
Pages: 26
Related papers
50 records in total
  • [1] Sparse Cholesky factorization for solving nonlinear PDEs via Gaussian processes
    Chen, Yifan
    Owhadi, Houman
    Schäfer, Florian
    MATHEMATICS OF COMPUTATION, 2024
  • [2] Solving and learning nonlinear PDEs with Gaussian processes
    Chen, Yifan
    Hosseini, Bamdad
    Owhadi, Houman
    Stuart, Andrew M.
    JOURNAL OF COMPUTATIONAL PHYSICS, 2021, 447
  • [3] Sparse Multimodal Gaussian Processes
    Liu, Qiuyang
    Sun, Shiliang
    INTELLIGENCE SCIENCE AND BIG DATA ENGINEERING, ISCIDE 2017, 2017, 10559 : 28 - 40
  • [4] Federated Sparse Gaussian Processes
    Guo, Xiangyang
    Wu, Daqing
    Ma, Jinwen
    INTELLIGENT COMPUTING METHODOLOGIES, PT III, 2022, 13395 : 267 - 276
  • [5] Solving large classes of nonlinear systems of PDEs
    Anguelov, Roumen
    Rosinger, Elemer E.
    COMPUTERS & MATHEMATICS WITH APPLICATIONS, 2007, 53 (3-4) : 491 - 507
  • [6] Multiscale empirical interpolation for solving nonlinear PDEs
    Calo, Victor M.
    Efendiev, Yalchin
    Galvis, Juan
    Ghommem, Mehdi
    JOURNAL OF COMPUTATIONAL PHYSICS, 2014, 278 : 204 - 220
  • [7] Sparse on-line Gaussian processes
    Csató, L
    Opper, M
    NEURAL COMPUTATION, 2002, 14 (03) : 641 - 668
  • [8] Doubly Sparse Variational Gaussian Processes
    Adam, Vincent
    Eleftheriadis, Stefanos
    Durrande, Nicolas
    Artemev, Artem
    Hensman, James
    INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 108, 2020, 108 : 2874 - 2883
  • [9] MCMC for Variationally Sparse Gaussian Processes
    Hensman, James
    Matthews, Alexander G. de G.
    Filippone, Maurizio
    Ghahramani, Zoubin
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 28 (NIPS 2015), 2015, 28
  • [10] Input Dependent Sparse Gaussian Processes
    Jafrasteh, Bahram
    Villacampa-Calvo, Carlos
    Hernandez-Lobato, Daniel
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 162, 2022