iPINNs: incremental learning for Physics-informed neural networks

Cited by: 0
Authors
Dekhovich, Aleksandr [1 ]
Sluiter, Marcel H. F. [1 ]
Tax, David M. J. [2 ]
Bessa, Miguel A. [3 ]
Affiliations
[1] Delft Univ Technol, Dept Mat Sci & Engn, Mekelweg 2, NL-2628 CD Delft, Netherlands
[2] Delft Univ Technol, Pattern Recognit & Bioinformat Lab, Mourik Broekmanweg 6, NL-2628 XE Delft, Netherlands
[3] Brown Univ, Sch Engn, 184 Hope St, Providence, RI 02912 USA
Keywords
Physics-informed neural networks (PINNs); Scientific machine learning (SciML); Incremental learning; Sparsity
DOI
10.1007/s00366-024-02010-1
Chinese Library Classification
TP39 (Applications of computers)
Subject classification codes
081203; 0835
Abstract
Physics-informed neural networks (PINNs) have recently become a powerful tool for solving partial differential equations (PDEs). However, finding a set of neural network parameters that fulfill a PDE at the boundary and within the domain of interest can be challenging and non-unique due to the complexity of the loss landscape that needs to be traversed. Although a variety of multi-task learning and transfer learning approaches have been proposed to overcome these issues, no incremental training procedure has been proposed for PINNs. As demonstrated herein, by developing incremental PINNs (iPINNs) we can effectively mitigate such training challenges and learn multiple tasks (equations) sequentially without additional parameters for new tasks. Interestingly, we show that this also improves performance for every equation in the sequence. Our approach learns multiple PDEs starting from the simplest one by creating its own subnetwork for each PDE and allowing each subnetwork to overlap with previously learned subnetworks. We demonstrate that previous subnetworks are a good initialization for a new equation if PDEs share similarities. We also show that iPINNs achieve lower prediction error than regular PINNs for two different scenarios: (1) learning a family of equations (e.g., 1-D convection PDE); and (2) learning PDEs resulting from a combination of processes (e.g., 1-D reaction-diffusion PDE). The ability to learn all problems with a single network together with learning more complex PDEs with better generalization than regular PINNs will open new avenues in this field.
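The abstract's core mechanism — each PDE in the sequence gets its own subnetwork, expressed as a binary mask over a shared set of weights, and a new task's mask may overlap (reuse) connections learned for earlier tasks — can be sketched in a few lines. The snippet below is a minimal, hypothetical illustration under assumed names (`mask1_W1`, `forward`, etc.), not the authors' implementation, and omits training entirely; it only shows how per-task masks carve overlapping subnetworks out of one dense network without adding parameters for new tasks.

```python
import numpy as np

rng = np.random.default_rng(0)

# Dense weights shared by all tasks (toy 2-layer MLP: 1 -> 8 -> 1).
W1 = rng.normal(size=(8, 1))
W2 = rng.normal(size=(1, 8))

# Hypothetical per-task binary masks: each task "owns" a subnetwork.
# Task 2's mask may overlap task 1's, reusing previously learned connections.
mask1_W1 = (rng.random(W1.shape) < 0.5).astype(float)
mask2_W1 = np.clip(mask1_W1 + (rng.random(W1.shape) < 0.3), 0.0, 1.0)

def forward(x, m_W1):
    # Only the connections active in this task's mask contribute.
    h = np.tanh((W1 * m_W1) @ x)
    return W2 @ h

x = np.array([[0.5]])
y1 = forward(x, mask1_W1)   # prediction using task 1's subnetwork
y2 = forward(x, mask2_W1)   # task 2 reuses task 1's connections plus new ones
overlap = (mask1_W1 * mask2_W1).sum()  # connections shared by both subnetworks
```

In the actual method the masks would be found by pruning during sequential training, so that the earlier subnetwork serves as an initialization for related PDEs; here the masks are random purely to show the bookkeeping.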
Pages: 14