STACKED NETWORKS IMPROVE PHYSICS-INFORMED TRAINING: APPLICATIONS TO NEURAL NETWORKS AND DEEP OPERATOR NETWORKS

Cited by: 1
Authors
Howard, Amanda A. [1]
Murphy, Sarah H. [1,2]
Ahmed, Shady E. [1]
Stinis, Panos [1,3,4]
Affiliations
[1] Pacific Northwest Natl Lab, Richland, WA 99354 USA
[2] Univ N Carolina, Charlotte, NC USA
[3] Univ Washington, Dept Appl Math, Seattle, WA 98105 USA
[4] Brown Univ, Div Appl Math, Providence, RI 02812 USA
Funding
Academy of Finland
Keywords
Physics-informed neural networks; deep operator networks; physics-informed operator networks; multifidelity; neural networks; FRAMEWORK
DOI
10.3934/fods.2024029
Chinese Library Classification
O29 [Applied Mathematics]
Subject Classification Code
070104
Abstract
Physics-informed neural networks and operator networks have shown promise for effectively solving equations modeling physical systems. However, these networks can be difficult or impossible to train accurately for some systems of equations. We present a novel multifidelity framework for stacking physics-informed neural networks and operator networks that facilitates training. We successively build a chain of networks, where the output at one step can act as a low-fidelity input for training the next step, gradually increasing the expressivity of the learned model. The equations imposed at each step of the iterative process can be the same or different (akin to simulated annealing). The iterative (stacking) nature of the proposed method allows us to progressively learn features of a solution that are hard to learn directly. Through benchmark problems including a nonlinear pendulum, the wave equation, and the viscous Burgers equation, we show how stacking can be used to improve the accuracy and reduce the required size of physics-informed neural networks and operator networks.
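The record itself contains no code. As a rough illustration of the stacking idea described in the abstract (the output of one trained network fed as a low-fidelity input to the next), the following is a minimal sketch assuming a PyTorch implementation and the nonlinear pendulum benchmark; the network sizes, training loop, loss weighting, and the simple concatenation of the previous prediction are illustrative assumptions, not the authors' multifidelity architecture or settings.

# Minimal sketch of stacked physics-informed training (assumed PyTorch, nonlinear pendulum).
import torch
import torch.nn as nn

class MLP(nn.Module):
    """Small fully connected network used for each stage of the stack."""
    def __init__(self, in_dim, width=32, depth=3, out_dim=2):
        super().__init__()
        layers, d = [], in_dim
        for _ in range(depth):
            layers += [nn.Linear(d, width), nn.Tanh()]
            d = width
        layers += [nn.Linear(d, out_dim)]
        self.net = nn.Sequential(*layers)

    def forward(self, x):
        return self.net(x)

def pendulum_residual(t, u):
    """Physics residual of theta'' + sin(theta) = 0, written as the
    first-order system u = (theta, omega)."""
    theta, omega = u[:, 0:1], u[:, 1:2]
    dtheta = torch.autograd.grad(theta, t, torch.ones_like(theta), create_graph=True)[0]
    domega = torch.autograd.grad(omega, t, torch.ones_like(omega), create_graph=True)[0]
    return (dtheta - omega) ** 2 + (domega + torch.sin(theta)) ** 2

def train_stage(model, t, prev_pred, u0, steps=2000, lr=1e-3):
    """Train one stage: the network sees (t, previous stage's prediction) and is
    fit with the physics residual plus the initial condition (illustrative weights)."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        u = model(torch.cat([t, prev_pred], dim=1))
        ic = model(torch.cat([t[:1] * 0, prev_pred[:1]], dim=1))  # prediction at t = 0
        loss = pendulum_residual(t, u).mean() + ((ic - u0) ** 2).mean()
        loss.backward()
        opt.step()
    return model

# Build the stack: each stage reuses the previous stage's output as a low-fidelity input.
t = torch.linspace(0.0, 10.0, 200).reshape(-1, 1).requires_grad_(True)
u0 = torch.tensor([[1.0, 0.0]])              # initial theta and omega
prev = torch.zeros(t.shape[0], 2)            # trivial low-fidelity guess for the first stage
for stage in range(3):
    model = MLP(in_dim=1 + 2)                # input: time plus the previous prediction
    model = train_stage(model, t, prev.detach(), u0)
    prev = model(torch.cat([t, prev.detach()], dim=1))  # becomes the next stage's input

Each pass through the loop adds one network to the chain, so later stages only need to learn a correction on top of the earlier, lower-fidelity solution rather than the full solution from scratch.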
Pages: 29