Learning Specialized Activation Functions for Physics-Informed Neural Networks

Cited by: 2
Authors
Wang, Honghui [1 ]
Lu, Lu [2 ]
Song, Shiji [1 ]
Huang, Gao [1 ]
Affiliations
[1] Tsinghua Univ, Dept Automat, Beijing 100084, Peoples R China
[2] Yale Univ, Dept Stat & Data Sci, New Haven, CT 06511 USA
Funding
National Natural Science Foundation of China
Keywords
Partial differential equations; deep learning; adaptive activation functions; physics-informed neural networks; ALGORITHM; CONVERGENCE; FRAMEWORK;
DOI
10.4208/cicp.OA-2023-0058
Chinese Library Classification (CLC) number
O4 [Physics]
Subject classification code
0702
Abstract
Physics-informed neural networks (PINNs) are known to suffer from optimization difficulty. In this work, we reveal the connection between the optimization difficulty of PINNs and activation functions. Specifically, we show that PINNs exhibit high sensitivity to activation functions when solving PDEs with distinct properties. Existing works usually choose activation functions by inefficient trial and error. To avoid this inefficient manual selection and to alleviate the optimization difficulty of PINNs, we introduce adaptive activation functions that search for the optimal function when solving different problems. We compare different adaptive activation functions and discuss their limitations in the context of PINNs. Furthermore, we propose to tailor the idea of learning combinations of candidate activation functions to PINN optimization, which places higher requirements on the smoothness and diversity of the learned functions. This is achieved by removing from the candidate set activation functions that cannot provide higher-order derivatives, and by incorporating elementary functions with different properties according to our prior knowledge about the PDE at hand. We further enhance the search space with adaptive slopes. The proposed adaptive activation function can be used to solve different PDE systems in an interpretable way. Its effectiveness is demonstrated on a series of benchmarks. Code is available at https://github.com/LeapLabTHU/AdaAFforPINNs.
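The abstract describes learning a weighted combination of smooth candidate activation functions together with an adaptive slope. The snippet below is a minimal, hypothetical PyTorch-style sketch of that idea; the class name, candidate set, and parameterization are illustrative assumptions and do not reproduce the authors' released code (see the linked repository for the actual implementation).

import torch
import torch.nn as nn

class AdaptiveActivation(nn.Module):
    # Illustrative sketch: a learnable mixture of smooth candidate
    # activations, scaled by an adaptive slope (names are assumptions).
    def __init__(self, slope_init: float = 1.0):
        super().__init__()
        # Only smooth candidates, so the higher-order derivatives needed
        # by PDE residuals exist (e.g. ReLU would be excluded).
        self.candidates = [torch.tanh, torch.sin, torch.sigmoid,
                           torch.nn.functional.silu]
        self.logits = nn.Parameter(torch.zeros(len(self.candidates)))  # mixture weights
        self.slope = nn.Parameter(torch.tensor(slope_init))            # adaptive slope

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        w = torch.softmax(self.logits, dim=0)  # normalize the mixture weights
        z = self.slope * x                     # scale the pre-activation
        return sum(wi * f(z) for wi, f in zip(w, self.candidates))

# Example use in a small PINN-style network (2D input, scalar output):
net = nn.Sequential(
    nn.Linear(2, 64), AdaptiveActivation(),
    nn.Linear(64, 64), AdaptiveActivation(),
    nn.Linear(64, 1),
)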
Pages: 869-906
Number of pages: 38