Flow field reconstruction from sparse sensor measurements with physics-informed neural networks

Cited by: 3
Authors
Hosseini, Mohammad Yasin [1 ]
Shiri, Yousef [1 ]
Affiliations
[1] Shahrood Univ Technol, Fac Min Petr & Geophys Engn, Shahrood, Iran
DOI
10.1063/5.0211680
Chinese Library Classification
O3 [Mechanics]
Subject Classification Codes
08; 0801
Abstract
In the realm of experimental fluid mechanics, accurately reconstructing high-resolution flow fields is notably challenging because data are often sparse and incomplete across the time and space domains. This is exacerbated by the limitations of current experimental tools and methods, which leave critical areas without measurable data. This research proposes a feasible solution to this problem by employing an inverse physics-informed neural network (PINN) to merge the available sparse data with physical laws. The method's efficacy is demonstrated using flow around a cylinder as a case study, with three distinct training sets: one used sparse velocity data from the interior of the domain, and the other two used limited velocity data obtained from the domain boundaries and from sensors around the cylinder wall. The coefficient of determination (R²) and root mean squared error (RMSE), indicative of model performance, were computed for the velocity components of all models. For the 28-sensor model, the R² value stands at 0.996 with an associated RMSE of 0.0251 for the u component, while for the v component the R² value registers at 0.969, accompanied by an RMSE of 0.0169. The outcomes indicate that the method can successfully recreate the actual velocity field with considerable precision with more than 28 sensors around the cylinder, highlighting PINN's potential as an effective data assimilation technique for experimental fluid mechanics.
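The R² and RMSE values quoted in the abstract follow the standard definitions and can be computed directly from paired reference and reconstructed velocity samples. A minimal sketch (the velocity arrays below are hypothetical placeholders, not the paper's data):

```python
import math

def r2_score(y_true, y_pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean) ** 2 for t in y_true)
    return 1.0 - ss_res / ss_tot

def rmse(y_true, y_pred):
    """Root mean squared error."""
    n = len(y_true)
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / n)

# Hypothetical u-velocity samples: reference values (e.g. from CFD)
# versus a PINN reconstruction at the same probe locations.
u_true = [1.0, 0.8, 0.5, 0.2, 0.0]
u_pred = [0.98, 0.82, 0.49, 0.21, 0.01]
print(f"R2   = {r2_score(u_true, u_pred):.4f}")
print(f"RMSE = {rmse(u_true, u_pred):.4f}")
```

In practice these metrics would be evaluated per velocity component (u and v) over the full reconstruction grid, as reported for the 28-sensor model above.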
Pages: 14
Related Papers (50 records)
  • [1] Reconstruction of the turbulent flow field with sparse measurements using physics-informed neural network
    Chaurasia, Nagendra Kumar
    Chakraborty, Shubhankar
    PHYSICS OF FLUIDS, 2024, 36 (08)
  • [2] Sparse wavefield reconstruction based on Physics-Informed neural networks
    Xu, Bin
    Zou, Yun
    Sha, Gaofeng
    Yang, Liang
    Cai, Guixi
    Li, Yang
    ULTRASONICS, 2025, 149
  • [3] Flow reconstruction with uncertainty quantification from noisy measurements based on Bayesian physics-informed neural networks
    Liu, Hailong
    Wang, Zhi
    Deng, Rui
    Wang, Shipeng
    Meng, Xuhui
    Xu, Chao
    Cai, Shengze
    PHYSICS OF FLUIDS, 2024, 36 (11)
  • [4] High-resolution reconstruction of turbulent flames from sparse data with physics-informed neural networks
    Liu, Shiyu
    Wang, Haiou
    Chen, Jacqueline H.
    Luo, Kun
    Fan, Jianren
    COMBUSTION AND FLAME, 2024, 260
  • [5] RF-PINNs: Reactive flow physics-informed neural networks for field reconstruction of laminar and turbulent flames using sparse data
    Yadav, Vikas
    Casel, Mario
    Ghani, Abdulla
    JOURNAL OF COMPUTATIONAL PHYSICS, 2025, 524
  • [6] Reconstruction of Ship Propeller Wake Field Based on Physics-Informed Neural Networks
    Hou, Xianrui
    Zhou, Xingyu
    Huang, Xiaocheng
    Shanghai Jiaotong Daxue Xuebao/Journal of Shanghai Jiaotong University, 2024, 58 (11): 1654-1664
  • [7] Mean flow reconstruction of unsteady flows using physics-informed neural networks
    Sliwinski, Lukasz
    Rigas, Georgios
    DATA-CENTRIC ENGINEERING, 2023, 4 (01)
  • [8] Uncovering near-wall blood flow from sparse data with physics-informed neural networks
    Arzani, Amirhossein
    Wang, Jian-Xun
    D'Souza, Roshan M.
    PHYSICS OF FLUIDS, 2021, 33 (07)
  • [9] Investigation of Physics-Informed Neural Networks to Reconstruct a Flow Field with High Resolution
    Yang, Zhou
    Xu, Yuwang
    Jing, Jionglin
    Fu, Xuepeng
    Wang, Bofu
    Ren, Haojie
    Zhang, Mengmeng
    Sun, Tongxiao
    JOURNAL OF MARINE SCIENCE AND ENGINEERING, 2023, 11 (11)
  • [10] Reconstruction of dynamic wind turbine wake flow fields from virtual Lidar measurements via physics-informed neural networks
    Zhang, Jincheng
    Zhao, Xiaowei
    SCIENCE OF MAKING TORQUE FROM WIND, TORQUE 2024, 2024, 2767