Self-supervised learning based on Transformer for flow reconstruction and prediction

Cited: 0
Authors
Xu, Bonan [1 ]
Zhou, Yuanye [2 ]
Bian, Xin [1 ]
Affiliations
[1] Zhejiang Univ, Dept Engn Mech, State Key Lab Fluid Power & Mechatron Syst, Hangzhou 310027, Peoples R China
[2] Baidu Inc, Beijing 100085, Peoples R China
Keywords
NEURAL-NETWORKS; DEEP;
DOI
10.1063/5.0188998
CLC Classification
O3 [Mechanics];
Subject Classification Code
08; 0801
Abstract
Machine learning has great potential for efficient reconstruction and prediction of flow fields. However, existing datasets may carry highly diversified labels across different flow scenarios, which makes them unsuitable for training a single model. To this end, we make a first attempt to apply the self-supervised learning (SSL) technique to fluid dynamics, where data labels are disregarded during pre-training. The SSL stage exploits a large amount of data (8000 snapshots) at Reynolds numbers Re = 200, 300, 400, and 500 without discriminating between them, which improves the generalization of the model. The Transformer model is pre-trained via a specially designed pretext task, in which it reconstructs the complete flow field after 20% of the data points in each snapshot are randomly masked. For the downstream task of flow reconstruction, the pre-trained model is fine-tuned separately with 256 snapshots for each Reynolds number. The fine-tuned models accurately reconstruct the complete flow fields from less than 5% random data points within a limited window, even for Re = 250 and 600, whose data were not seen during pre-training. For the other downstream task of flow prediction, the pre-trained model is fine-tuned separately with 128 consecutive snapshot pairs for each corresponding Reynolds number. The fine-tuned models then correctly predict the evolution of the flow fields over many cycles. We compare the results of models trained via SSL with those of models trained via supervised learning; the former show unequivocally superior performance. We expect the methodology presented here to find wider applications in fluid mechanics.
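The pretext task described in the abstract can be illustrated with a minimal sketch. This is not the authors' code: it assumes PyTorch, treats each snapshot as a sequence of grid points carrying flow variables (e.g., two velocity components and pressure), and uses hypothetical names such as FlowMaskedAutoencoder, n_points, and pretrain_step. The architectural details (embedding size, number of layers, masking scheme within a window) are assumptions, not taken from the paper.

```python
# Minimal sketch (assumed, not the authors' implementation) of the
# masked-reconstruction pretext task: hide 20% of the points in a snapshot
# and train a Transformer to regress the complete flow field.
import torch
import torch.nn as nn

class FlowMaskedAutoencoder(nn.Module):
    """Transformer that reconstructs a full snapshot from partially masked points."""

    def __init__(self, n_points=1024, n_fields=3, d_model=128, n_heads=4, n_layers=4):
        super().__init__()
        self.embed = nn.Linear(n_fields, d_model)                    # per-point value embedding
        self.pos = nn.Parameter(torch.zeros(1, n_points, d_model))   # learned positional encoding
        self.mask_token = nn.Parameter(torch.zeros(1, 1, d_model))   # placeholder for hidden points
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, n_fields)                     # map back to (u, v, p)

    def forward(self, snapshot, mask):
        # snapshot: (batch, n_points, n_fields); mask: (batch, n_points) bool, True = hidden
        tokens = self.embed(snapshot)
        tokens = torch.where(mask.unsqueeze(-1), self.mask_token.expand_as(tokens), tokens)
        tokens = tokens + self.pos
        return self.head(self.encoder(tokens))


def pretrain_step(model, snapshot, mask_ratio=0.2):
    """One SSL step: randomly hide mask_ratio of the points, regress the full field."""
    mask = torch.rand(snapshot.shape[:2], device=snapshot.device) < mask_ratio
    recon = model(snapshot, mask)
    return nn.functional.mse_loss(recon, snapshot)


# Toy usage: a batch of 4 snapshots with 1024 points and 3 flow variables.
model = FlowMaskedAutoencoder()
loss = pretrain_step(model, torch.randn(4, 1024, 3))
loss.backward()
```

Under the same assumptions, downstream fine-tuning would reuse the pre-trained encoder: for flow reconstruction, the mask hides all but roughly 5% of the points; for flow prediction, the input and regression target become consecutive snapshot pairs.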
Pages: 14