Self-Scalable Tanh (Stan): Multi-Scale Solutions for Physics-Informed Neural Networks

Cited by: 0
Authors
Gnanasambandam, Raghav [1 ]
Shen, Bo [2 ]
Chung, Jihoon [1 ]
Yue, Xubo [3 ]
Kong, Zhenyu [1 ]
Affiliations
[1] Virginia Tech, Ind & Syst Engn, Blacksburg, VA 24061 USA
[2] New Jersey Inst Technol, Dept Mech & Ind Engn, Newark, NJ 07102 USA
[3] Northeastern Univ, Dept Mech & Ind Engn, Boston, MA 02115 USA
Keywords
Differential equations; Training; Mathematical models; Artificial neural networks; Neural networks; Inverse problems; Standards; Activation function; physics-informed neural networks; differential equations; inverse problem; DEEP LEARNING FRAMEWORK;
DOI
10.1109/TPAMI.2023.3307688
CLC Classification Number
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Differential equations are fundamental to modeling numerous physical systems, including thermal, manufacturing, and meteorological systems. Traditionally, the solutions of complex systems modeled by differential equations are approximated with numerical methods. With the advent of modern deep learning, Physics-Informed Neural Networks (PINNs) are emerging as a new paradigm for solving differential equations with a pseudo-closed-form solution. Unlike numerical methods, PINNs can solve differential equations mesh-free, integrate experimental data, and resolve challenging inverse problems. However, one limitation of PINNs is poor training caused by activation functions that were typically designed for purely data-driven problems. This work proposes a scalable tanh-based activation function for PINNs to improve learning of the solutions of differential equations. The proposed Self-scalable tanh (Stan) function is smooth, non-saturating, and has a trainable parameter. It allows an easy flow of gradients and enables systematic scaling of the input-output mapping during training. Various forward problems (solving differential equations) and inverse problems (identifying the parameters of differential equations) demonstrate that the Stan activation function achieves better training and more accurate predictions than the existing activation functions for PINNs in the literature.
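
The abstract describes Stan as a smooth, non-saturating tanh variant with a trainable scale parameter that rescales the input-output mapping during training. A minimal sketch of such an activation, assuming the form tanh(x) + beta * x * tanh(x) with one trainable beta per neuron (this parameterization and all names below are illustrative assumptions, not taken verbatim from the paper), could look as follows in PyTorch:

import torch
import torch.nn as nn

class Stan(nn.Module):
    # Self-scalable tanh sketch: tanh(x) + beta * x * tanh(x).
    # beta is trainable, so the activation can rescale its input-output
    # mapping during training while staying smooth and non-saturating.
    # NOTE: per-feature beta and beta_init=1.0 are assumptions for illustration.
    def __init__(self, num_features: int, beta_init: float = 1.0):
        super().__init__()
        self.beta = nn.Parameter(torch.full((num_features,), beta_init))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        t = torch.tanh(x)
        return t + self.beta * x * t

# Example: drop-in replacement for tanh in a small PINN-style MLP;
# beta is updated by the same optimizer as the network weights.
net = nn.Sequential(nn.Linear(2, 50), Stan(50), nn.Linear(50, 50), Stan(50), nn.Linear(50, 1))
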
Pages: 15588-15603
Page count: 16
Related Papers (50 in total)
  • [1] Multi-scale graph neural network for physics-informed fluid simulation
    Wei, Lan
    Freris, Nikolaos M.
    VISUAL COMPUTER, 2024,
  • [2] Scalable algorithms for physics-informed neural and graph networks
    Shukla, Khemraj
    Xu, Mengjia
    Trask, Nathaniel
    Karniadakis, George E.
    DATA-CENTRIC ENGINEERING, 2022, 3 (06)
  • [3] Physics-Informed Neural Networks for High-Frequency and Multi-Scale Problems Using Transfer Learning
    Mustajab, Abdul Hannan
    Lyu, Hao
    Rizvi, Zarghaam
    Wuttke, Frank
    APPLIED SCIENCES-BASEL, 2024, 14 (08)
  • [4] On the eigenvector bias of Fourier feature networks: From regression to solving multi-scale PDEs with physics-informed neural networks
    Wang, Sifan
    Wang, Hanwen
    Perdikaris, Paris
    COMPUTER METHODS IN APPLIED MECHANICS AND ENGINEERING, 2021, 384
  • [5] Self-adaptive physics-informed neural networks
    McClenny, Levi D.
    Braga-Neto, Ulisses M.
    JOURNAL OF COMPUTATIONAL PHYSICS, 2023, 474
  • [6] Enforcing Dirichlet boundary conditions in physics-informed neural networks and variational physics-informed neural networks
    Berrone, S.
    Canuto, C.
    Pintore, M.
    Sukumar, N.
    HELIYON, 2023, 9 (08)
  • [7] Multi-scale modeling in thermal conductivity of Polyurethane incorporated with Phase Change Materials using Physics-Informed Neural Networks
    Liu, Bokai
    Wang, Yizheng
    Rabczuk, Timon
    Olofsson, Thomas
    Lu, Weizhuo
    RENEWABLE ENERGY, 2024, 220
  • [8] Separable Physics-Informed Neural Networks
    Cho, Junwoo
    Nam, Seungtae
    Yang, Hyunmo
    Yun, Seok-Bae
    Hong, Youngjoon
    Park, Eunbyung
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023,