Self-supervised learning based on Transformer for flow reconstruction and prediction

Cited: 0
Authors
Xu, Bonan [1 ]
Zhou, Yuanye [2 ]
Bian, Xin [1 ]
Affiliations
[1] Zhejiang Univ, Dept Engn Mech, State Key Lab Fluid Power & Mechatron Syst, Hangzhou 310027, Peoples R China
[2] Baidu Inc, Beijing 100085, Peoples R China
Keywords
NEURAL-NETWORKS; DEEP;
DOI
10.1063/5.0188998
Chinese Library Classification
O3 [Mechanics]
Discipline Classification Codes
08; 0801
Abstract
Machine learning has great potential for the efficient reconstruction and prediction of flow fields. However, existing datasets may carry highly diversified labels for different flow scenarios, which makes them unsuitable for training a single model. To this end, we make a first attempt to apply the self-supervised learning (SSL) technique to fluid dynamics, which disregards data labels when pre-training the model. The SSL technique embraces a large amount of data (8000 snapshots) at Reynolds numbers of Re = 200, 300, 400, and 500 without discriminating between them, which improves the generalization of the model. The Transformer model is pre-trained via a specially designed pretext task, in which it reconstructs the complete flow fields after 20% of the data points in each snapshot are randomly masked. For the downstream task of flow reconstruction, the pre-trained model is fine-tuned separately with 256 snapshots for each Reynolds number. The fine-tuned models accurately reconstruct the complete flow fields from less than 5% random data points within a limited window, even for Re = 250 and 600, whose data were not seen in the pre-training phase. For the other downstream task of flow prediction, the pre-trained model is fine-tuned separately with 128 consecutive snapshot pairs for each corresponding Reynolds number. The fine-tuned models then correctly predict the evolution of the flow fields over many periods of cycles. We compare all results generated by models trained via SSL against those from models trained via supervised learning, where the former shows unequivocally superior performance. We expect the methodology presented here to have wider applications in fluid mechanics.
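The masked-reconstruction pretext task described in the abstract can be sketched as follows. This is an illustrative outline only, not the authors' implementation: the helper names (`mask_snapshot`, `pretext_loss`), the zero placeholder for masked points, and the per-snapshot application of the 20% mask ratio are assumptions inferred from the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

def mask_snapshot(snapshot, mask_ratio=0.2, rng=rng):
    """Randomly mask a fraction of the data points in one flow snapshot.

    Returns the masked snapshot (masked points set to a placeholder value)
    and the boolean mask marking which points were hidden.
    """
    n_mask = int(mask_ratio * snapshot.size)
    idx = rng.choice(snapshot.size, size=n_mask, replace=False)
    mask = np.zeros(snapshot.size, dtype=bool)
    mask[idx] = True
    mask = mask.reshape(snapshot.shape)
    masked = snapshot.copy()
    masked[mask] = 0.0  # placeholder value for masked points (assumption)
    return masked, mask

def pretext_loss(model, snapshot, mask_ratio=0.2):
    """Self-supervised objective: reconstruct the complete field from the masked one.

    `model` stands in for the Transformer; here it is any callable that maps a
    masked snapshot to a predicted complete snapshot.
    """
    masked, _ = mask_snapshot(snapshot, mask_ratio)
    recon = model(masked)
    return float(np.mean((recon - snapshot) ** 2))
```

Because the target is the snapshot itself, no external labels are needed, which is what lets the pre-training pool snapshots from all Reynolds numbers indiscriminately.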
Pages: 14
Related papers
50 entries in total
  • [1] Traffic Prediction with Self-Supervised Learning: A Heterogeneity-Aware Model for Urban Traffic Flow Prediction Based on Self-Supervised Learning
    Gao, Min
    Wei, Yingmei
    Xie, Yuxiang
    Zhang, Yitong
    [J]. MATHEMATICS, 2024, 12 (09)
  • [2] Transformer-Based Self-Supervised Learning for Emotion Recognition
    Vazquez-Rodriguez, Juan
    Lefebvre, Gregoire
    Cumin, Julien
    Crowley, James L.
    [J]. 2022 26TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR), 2022, : 2605 - 2612
  • [3] A Recommendation Algorithm Based on a Self-supervised Learning Pretrain Transformer
    Xu, Yu-Hao
    Wang, Zhen-Hai
    Wang, Zhi-Ru
    Fan, Rong
    Wang, Xing
    [J]. NEURAL PROCESSING LETTERS, 2023, 55 (04): 4481 - 4497
  • [4] Pavement anomaly detection based on transformer and self-supervised learning
    Lin, Zijie
    Wang, Hui
    Li, Shenglin
    [J]. AUTOMATION IN CONSTRUCTION, 2022, 143
  • [5] Self-Supervised Time Series Representation Learning via Cross Reconstruction Transformer
    Zhang, Wenrui
    Yang, Ling
    Geng, Shijia
    Hong, Shenda
    [J]. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2023: 1 - 10
  • [6] TransDSSL: Transformer Based Depth Estimation via Self-Supervised Learning
    Han, Daechan
    Shin, Jeongmin
    Kim, Namil
    Hwang, Soonmin
    Choi, Yukyung
    [J]. IEEE ROBOTICS AND AUTOMATION LETTERS, 2022, 7 (04): 10969 - 10976
  • [7] Prediction of Protein Tertiary Structure Using Pre-Trained Self-Supervised Learning Based on Transformer
    Kurniawan, Alif
    Jatmiko, Wisnu
    Hertadi, Rukman
    Habibie, Novian
    [J]. 2020 5TH INTERNATIONAL WORKSHOP ON BIG DATA AND INFORMATION SECURITY (IWBIS 2020), 2020: 75 - 80
  • [8] Spatio-Temporal Self-Supervised Learning for Traffic Flow Prediction
    Ji, Jiahao
    Wang, Jingyuan
    Huang, Chao
    Wu, Junjie
    Xu, Boren
    Wu, Zhenhe
    Zhang, Junbo
    Zheng, Yu
    [J]. THIRTY-SEVENTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 37 NO 4, 2023: 4356 - 4364
  • [9] Bearings RUL prediction based on contrastive self-supervised learning
    Deng, WeiKun
    Nguyen, Khanh T. P.
    Medjaher, Kamal
    Gogu, Christian
    Morio, Jerome
    [J]. IFAC PAPERSONLINE, 2023, 56 (02): 11906 - 11911