Variable separated physics-informed neural networks based on adaptive weighted loss functions for blood flow model

Cited: 0
Authors
Liu, Youqiong [1 ,2 ]
Cai, Li [1 ,3 ,4 ]
Chen, Yaping [1 ,3 ,5 ]
Ma, Pengfei [1 ,3 ,4 ]
Zhong, Qian [1 ,3 ,4 ]
Affiliations
[1] Northwestern Polytech Univ, Sch Math & Stat, Xian 710129, Peoples R China
[2] Xinyang Normal Univ, Sch Math & Stat, Xinyang 464000, Peoples R China
[3] NPU, UoG Int Cooperat Lab Computat & Applicat Cardiol, Xian 710072, Shaanxi, Peoples R China
[4] Xian Key Lab Sci Computat & Appl Stat, Xian 710129, Peoples R China
[5] Hong Kong Polytech Univ, Dept Appl Math, Hung Hom, Kowloon, Hong Kong, Peoples R China
Funding
National Natural Science Foundation of China; China Postdoctoral Science Foundation
Keywords
Blood flow model; Physics-informed neural networks; The minmax algorithm; Adaptive weighted optimizer; NAVIER-STOKES EQUATIONS; WALL SHEAR; NUMERICAL SCHEMES; FLUID-MECHANICS; ARTERIAL-WALL; FRAMEWORK; VELOCITY; VESSELS; STRESS; PLAQUE;
DOI
10.1016/j.camwa.2023.11.018
CLC Classification
O29 [Applied Mathematics]
Discipline Code
070104
Abstract
Physics-informed neural network (PINN) architectures have recently been explored to accelerate hemodynamics simulations by leveraging mathematical models of blood flow together with empirical data. In this paper, a variable-separated physics-informed neural network based on adaptive weighted loss functions (AW-vsPINN) is developed for the blood flow model in arteries. In particular, sub-neural networks sharing the same input layer are proposed to separately predict the unknown scalar state variables. The AW-vsPINN adaptively adjusts the weights of the loss terms via a minmax algorithm; the weights are updated synchronously with the network parameters and balance the contributions of the different loss terms during training. A two-stage optimization is implemented to train the neural networks: the Adam optimizer runs for the initial steps with a learning rate generated by an inverse time decay scheduler, and the L-BFGS optimizer then continues training until the loss converges. Numerical results illustrate that the AW-vsPINN markedly improves prediction accuracy and generalization compared to the conventional PINN. The proposed AW-vsPINN framework has high potential for predicting blood flow information in cardiovascular disease.
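The minmax-style adaptive weighting described in the abstract can be illustrated with a toy sketch. This is not the authors' implementation: the two quadratic loss terms, the step sizes, and the softmax parameterization of the weights are invented stand-ins for the PDE-residual and data losses of a real PINN. The idea shown is the one the abstract states: the parameters descend the weighted loss while the weights ascend it, so the weights update synchronously with the parameters and shift toward under-fitted terms.

```python
import math

def loss_terms(theta):
    """Two hypothetical loss components standing in for a PDE residual
    term and a data-mismatch term of a PINN."""
    pde = (theta - 2.0) ** 2          # stand-in for the PDE residual loss
    data = 4.0 * (theta + 1.0) ** 2   # stand-in for the data loss (steeper)
    return pde, data

def grad_terms(theta):
    """Analytic gradients of the two toy loss terms."""
    return 2.0 * (theta - 2.0), 8.0 * (theta + 1.0)

def train(steps=500, lr_theta=0.05, lr_w=0.01):
    theta = 0.0        # "network parameter" (a single scalar here)
    a = [0.0, 0.0]     # log-weights; softmax keeps weights positive, summing to 1
    w = [0.5, 0.5]
    for _ in range(steps):
        L = loss_terms(theta)
        g = grad_terms(theta)
        # Softmax-normalized loss weights.
        z = [math.exp(x) for x in a]
        s = sum(z)
        w = [x / s for x in z]
        # Descent step on the parameter (minimization of the weighted loss).
        theta -= lr_theta * sum(wi * gi for wi, gi in zip(w, g))
        # Ascent step on the log-weights (maximization): the larger loss
        # term gains weight, steering training toward the harder term.
        a = [ai + lr_w * Li for ai, Li in zip(a, L)]
    return theta, w

theta, w = train()
```

At the descent-ascent equilibrium of this toy problem the two loss terms are balanced rather than one dominating, which is the qualitative effect the adaptive weighting is meant to achieve.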
Pages: 108-122 (15 pages)
Related Papers (50 in total)
  • [1] Physics-informed neural networks based on adaptive weighted loss functions for Hamilton-Jacobi equations
    Liu, Youqiong
    Cai, Li
    Chen, Yaping
    Wang, Bin
    [J]. MATHEMATICAL BIOSCIENCES AND ENGINEERING, 2022, 19(12): 12866-12896
  • [2] Physics-informed neural networks based cascade loss model
    Feng Y.
    Song X.
    Yuan W.
    Lu H.
    [J]. Hangkong Dongli Xuebao/Journal of Aerospace Power, 2023, 38(07): 845-855
  • [3] Self-adaptive loss balanced Physics-informed neural networks
    Xiang, Zixue
    Peng, Wei
    Liu, Xu
    Yao, Wen
    [J]. NEUROCOMPUTING, 2022, 496: 11-34
  • [4] The Role of Adaptive Activation Functions in Fractional Physics-Informed Neural Networks
    Coelho, C.
    Costa, M. Fernanda P.
    Ferras, L. L.
    [J]. INTERNATIONAL CONFERENCE ON NUMERICAL ANALYSIS AND APPLIED MATHEMATICS 2022, ICNAAM-2022, 2024, 3094
  • [5] Adaptive activation functions accelerate convergence in deep and physics-informed neural networks
    Jagtap, Ameya D.
    Kawaguchi, Kenji
    Karniadakis, George Em
    [J]. JOURNAL OF COMPUTATIONAL PHYSICS, 2020, 404
  • [7] Self-adaptive physics-informed neural networks
    McClenny, Levi D.
    Braga-Neto, Ulisses M.
    [J]. JOURNAL OF COMPUTATIONAL PHYSICS, 2023, 474
  • [8] Loss-attentional physics-informed neural networks
    Song, Yanjie
    Wang, He
    Yang, He
    Taccari, Maria Luisa
    Chen, Xiaohui
    [J]. JOURNAL OF COMPUTATIONAL PHYSICS, 2024, 501
  • [9] Temporal consistency loss for physics-informed neural networks
    Thakur, Sukirt
    Raissi, Maziar
    Mitra, Harsa
    Ardekani, Arezoo M.
    [J]. PHYSICS OF FLUIDS, 2024, 36 (07)