Residual-based attention in physics-informed neural networks

Cited by: 26
Authors
Anagnostopoulos, Sokratis J. [1 ]
Toscano, Juan Diego [2 ]
Stergiopulos, Nikolaos [1 ]
Karniadakis, George Em [2 ,3 ]
Affiliations
[1] Ecole Polytech Fed Lausanne, Lab Hemodynam & Cardiovasc Technol, CH-1015 Lausanne, VD, Switzerland
[2] Brown Univ, Div Appl Math, Providence, RI 02912 USA
[3] Brown Univ, Sch Engn, Providence, RI 02912 USA
Funding
Swiss National Science Foundation
Keywords
Residual-based attention; PINNs accuracy; Adaptive weights; Fast convergence;
DOI
10.1016/j.cma.2024.116805
Chinese Library Classification (CLC)
T [Industrial Technology]
Discipline Code
08
Abstract
Driven by the need for more efficient and seamless integration of physical models and data, physics-informed neural networks (PINNs) have seen a surge of interest in recent years. However, ensuring the reliability of their convergence and accuracy remains a challenge. In this work, we propose an efficient, gradient-less weighting scheme for PINNs that accelerates the convergence of dynamic or static systems. This simple yet effective attention mechanism is a bounded function of the evolving cumulative residuals, and it makes the optimizer aware of problematic regions at no extra computational cost and without adversarial training. We illustrate that this general method consistently achieves convergence one order of magnitude faster than vanilla PINNs and a minimum relative L2 error of O(10^-5) on typical benchmarks from the literature. The method is further tested on the inverse solution of the Navier-Stokes equations within the brain perivascular spaces, where it considerably improves the prediction accuracy. Furthermore, an ablation study is performed for each case to identify the contribution of the components that enhance the vanilla PINN formulation. Evident from the convergence trajectories is the optimizer's ability to escape poor local minima or saddle points while focusing on the challenging regions of the domain, which consistently carry a high residual score. We believe that, alongside exact boundary conditions and other model reparameterizations, this type of attention mask could be an essential element for the fast training of both PINNs and neural operators.
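For readers skimming this record, the sketch below illustrates the kind of gradient-less, bounded weighting the abstract describes: per-point weights that accumulate the normalized residual history so the optimizer emphasizes stubborn regions. The exponentially decaying cumulative form, the bound eta_star / (1 - gamma), and all names here (rba_update, gamma, eta_star, pde_residual) are illustrative assumptions, not code from the paper.

```python
import numpy as np

def rba_update(weights, residuals, gamma=0.999, eta_star=0.01):
    """One gradient-less update of per-point attention weights.

    weights   : current multipliers lambda_i, shape (N,)
    residuals : point-wise PDE residuals at the collocation points, shape (N,)
    gamma     : decay (memory) factor in (0, 1)
    eta_star  : attention learning rate
    """
    r = np.abs(residuals)
    # Normalizing by the max residual keeps each increment in [0, eta_star],
    # so the weights stay bounded by eta_star / (1 - gamma).
    return gamma * weights + eta_star * r / (r.max() + 1e-12)

# Hypothetical use inside a PINN training loop:
#   lam = np.ones(n_points)
#   r = pde_residual(model, x)        # point-wise residuals, shape (n_points,)
#   lam = rba_update(lam, r)          # no extra gradient computation
#   loss = np.mean((lam * r) ** 2)    # reweighted residual loss
```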
Pages: 19
Related Papers
50 items in total
  • [21] de Oliveira, Lyncoln S.; Kunstmann, Liliane; Pina, Debora; de Oliveira, Daniel; Mattoso, Marta. PINNProv: Provenance for Physics-Informed Neural Networks. 2023 International Symposium on Computer Architecture and High Performance Computing Workshops (SBAC-PADW), 2023: 16-23.
  • [22] Misyris, George S.; Venzke, Andreas; Chatzivasileiadis, Spyros. Physics-Informed Neural Networks for Power Systems. 2020 IEEE Power & Energy Society General Meeting (PESGM), 2020.
  • [23] Markidis, Stefano. On physics-informed neural networks for quantum computers. Frontiers in Applied Mathematics and Statistics, 2022, 8.
  • [24] Bastek, Jan-Hendrik; Kochmann, Dennis M. Physics-Informed Neural Networks for shell structures. European Journal of Mechanics A-Solids, 2023, 97.
  • [25] Pang, Guofei; Lu, Lu; Karniadakis, George Em. fPINNs: Fractional Physics-Informed Neural Networks. SIAM Journal on Scientific Computing, 2019, 41(4): A2603-A2626.
  • [26] De Ryck, Tim; Mishra, Siddhartha. Numerical analysis of physics-informed neural networks and related models in physics-informed machine learning. Acta Numerica, 2024, 33: 633-713.
  • [27] Wang, Hao; Li, Jian; Wang, Linfeng; Liang, Lin; Zeng, Zhoumo; Liu, Yang. On acoustic fields of complex scatters based on physics-informed neural networks. Ultrasonics, 2023, 128.
  • [28] von Saldern, Jakob G. R.; Reumschuessel, Johann Moritz; Kaiser, Thomas L.; Sieber, Moritz; Oberleithner, Kilian. Mean flow data assimilation based on physics-informed neural networks. Physics of Fluids, 2022, 34(11).
  • [29] Kim, Kyeongmin; Lee, Jonghwan. Stochastic Memristor Modeling Framework Based on Physics-Informed Neural Networks. Applied Sciences-Basel, 2024, 14(20).
  • [30] Tang, Pengfei; Zhang, Zhonghao; Tong, Jie; Long, Tianhang; Huang, Can; Qi, Zihao. Predicting transformer temperature field based on physics-informed neural networks. High Voltage, 2024, 9(4): 839-852.