Reduced order modeling for flow and transport problems with Barlow Twins self-supervised learning

Cited by: 0
Authors
Teeratorn Kadeethum
Francesco Ballarin
Daniel O’Malley
Youngsoo Choi
Nikolaos Bouklas
Hongkyu Yoon
Affiliations
[1] Sandia National Laboratories
[2] Cornell University
[3] Catholic University of the Sacred Heart
[4] Los Alamos National Laboratory
[5] Lawrence Livermore National Laboratory
DOI: not available
Abstract
We propose a unified data-driven reduced order model (ROM) that bridges the performance gap between linear and nonlinear manifold approaches. Deep learning ROMs (DL-ROMs) based on deep convolutional autoencoders (DC-AE) have been shown to capture nonlinear solution manifolds, but they perform inadequately when linear subspace approaches such as proper orthogonal decomposition (POD) would be optimal. Moreover, most DL-ROM models rely on convolutional layers, which may limit their application to structured meshes. The framework proposed in this study combines an autoencoder (AE) with Barlow Twins (BT) self-supervised learning, where BT maximizes the information content of the embedding in the latent space through a joint embedding architecture. Across a series of benchmark problems of natural convection in porous media, BT-AE outperforms the previous DL-ROM framework: it matches POD-based approaches on problems whose solutions lie within a linear subspace and matches DL-ROM autoencoder-based techniques on problems whose solutions lie on a nonlinear manifold, thereby bridging the gap between linear and nonlinear reduced manifolds. We illustrate that a proficient construction of the latent space is key to achieving these results, enabling us to map these latent spaces using regression models. The proposed framework achieves a relative error of 2% on average and 12% in the worst-case scenario (i.e., when the training data set is small but the parameter space is large).
We also show that our framework provides a speed-up of $7 \times 10^{6}$ times in the best case and $7 \times 10^{3}$ times on average compared to a finite element solver. Furthermore, this BT-AE framework can operate on unstructured meshes, which provides flexibility in its application to standard numerical solvers, on-site measurements, experimental data, or a combination of these sources.
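The Barlow Twins objective at the core of the abstract's BT-AE framework can be sketched as follows. This is a minimal NumPy illustration of the redundancy-reduction loss introduced by Zbontar et al. (reference [2] below), not the authors' implementation; the function name, the trade-off weight `lam`, and the batch/embedding sizes are chosen for illustration.

```python
import numpy as np

def barlow_twins_loss(z_a, z_b, lam=5e-3):
    """Barlow Twins redundancy-reduction loss on two embedding batches.

    z_a, z_b: (N, D) arrays, embeddings of two views of the same N samples.
    lam: illustrative weight trading off the two loss terms.
    """
    n, d = z_a.shape
    # Standardize each embedding dimension across the batch.
    z_a = (z_a - z_a.mean(axis=0)) / (z_a.std(axis=0) + 1e-8)
    z_b = (z_b - z_b.mean(axis=0)) / (z_b.std(axis=0) + 1e-8)
    # Cross-correlation matrix between the two views, shape (D, D).
    c = z_a.T @ z_b / n
    # Invariance term: push the diagonal toward 1 (views agree per dimension).
    on_diag = np.sum((1.0 - np.diag(c)) ** 2)
    # Redundancy term: push off-diagonal correlations toward 0.
    off_diag = np.sum(c ** 2) - np.sum(np.diag(c) ** 2)
    return on_diag + lam * off_diag

# Two identical views give a near-zero invariance term; independent
# embeddings are penalized heavily because the diagonal is far from 1.
rng = np.random.default_rng(0)
z = rng.normal(size=(64, 8))
loss_same = barlow_twins_loss(z, z)
loss_diff = barlow_twins_loss(z, rng.normal(size=(64, 8)))
```

In the BT-AE setting described above, this loss is used to shape the autoencoder's latent space so that it carries maximal, non-redundant information, which in turn makes the latent coordinates amenable to the regression models mentioned in the abstract.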
Related papers
(50 in total)
  • [1] Reduced order modeling for flow and transport problems with Barlow Twins self-supervised learning
    Kadeethum, Teeratorn
    Ballarin, Francesco
    O'Malley, Daniel
    Choi, Youngsoo
    Bouklas, Nikolaos
    Yoon, Hongkyu
    SCIENTIFIC REPORTS, 2022, 12 (01)
  • [2] Barlow Twins: Self-Supervised Learning via Redundancy Reduction
    Zbontar, Jure
    Jing, Li
    Misra, Ishan
    LeCun, Yann
    Deny, Stephane
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 139, 2021, 139
  • [3] Barlow Twins self-supervised learning for robust speaker recognition
    Mohammadamini, Mohammad
    Matrouf, Driss
    Bonastre, Jean-Francois
    Dowerah, Sandipana
    Serizel, Romain
    Jouvet, Denis
    INTERSPEECH 2022, 2022, : 4033 - 4037
  • [4] A Self-supervised Graph Autoencoder with Barlow Twins
    Li, Jingci
    Lu, Guangquan
    Li, Jiecheng
    PRICAI 2022: TRENDS IN ARTIFICIAL INTELLIGENCE, PT II, 2022, 13630 : 501 - 512
  • [5] Graph Barlow Twins: A self-supervised representation learning framework for graphs
    Bielak, Piotr
    Kajdanowicz, Tomasz
    Chawla, Nitesh V.
    KNOWLEDGE-BASED SYSTEMS, 2022, 256
  • [6] Boosting Barlow Twins Reduced Order Modeling for Machine Learning-Based Surrogate Models in Multiphase Flow Problems
    Kadeethum, T.
    Silva, V. L. S.
    Salinas, P.
    Pain, C. C.
    Yoon, H.
    WATER RESOURCES RESEARCH, 2024, 60 (10)
  • [7] Continual Barlow Twins: Continual Self-Supervised Learning for Remote Sensing Semantic Segmentation
    Marsocci, Valerio
    Scardapane, Simone
    IEEE JOURNAL OF SELECTED TOPICS IN APPLIED EARTH OBSERVATIONS AND REMOTE SENSING, 2023, 16 : 5049 - 5060
  • [8] Epistemic Uncertainty-Aware Barlow Twins Reduced Order Modeling for Nonlinear Contact Problems
    Kadeethum, Teeratorn
    Jakeman, John D.
    Choi, Youngsoo
    Bouklas, Nikolaos
    Yoon, Hongkyu
    IEEE ACCESS, 2023, 11 : 62970 - 62985
  • [9] Lead-fusion Barlow twins: A fused self-supervised learning method for multi-lead electrocardiograms
    Liu, Wenhan
    Pan, Shurong
    Li, Zhoutong
    Chang, Sheng
    Huang, Qijun
    Jiang, Nan
    INFORMATION FUSION, 2025, 114
  • [10] Supervised Learning for Convolutional Neural Network with Barlow Twins
    Murugan, Ramyaa
    Mojoo, Jonathan
    Kurita, Takio
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2022, PT IV, 2022, 13532 : 484 - 495