Neural-network-augmented projection-based model order reduction for mitigating the Kolmogorov barrier to reducibility

Cited by: 16
Authors
Barnett, Joshua [1 ]
Farhat, Charbel [1 ,2 ,3 ]
Maday, Yvon [4 ,5 ]
Affiliations
[1] Stanford Univ, Dept Mech Engn, Stanford, CA 94305 USA
[2] Stanford Univ, Dept Aeronaut & Astronaut, Stanford, CA 94305 USA
[3] Stanford Univ, Inst Computat & Math Engn, Stanford, CA 94305 USA
[4] Sorbonne Univ, Lab Jacques Louis Lions (LJLL), F-75005 Paris, France
[5] Univ Paris Cite, CNRS, F-75005 Paris, France
Keywords
Artificial neural network; Burgers; Kolmogorov n-width; Machine learning; Model reduction; Petrov-Galerkin; INTERPOLATION METHOD; HYPER REDUCTION; DECOMPOSITION;
DOI
10.1016/j.jcp.2023.112420
Chinese Library Classification (CLC)
TP39 [Computer Applications];
Discipline codes
081203; 0835;
Abstract
Inspired by our previous work on a quadratic approximation manifold [1], we propose in this paper a computationally tractable approach for combining a projection-based reduced-order model (PROM) and an artificial neural network (ANN) to mitigate the Kolmogorov barrier to reducibility of parametric and/or highly nonlinear, high-dimensional, physics-based models. The main objective of our PROM-ANN concept is to reduce the dimensionality of the online approximation of the solution beyond what is achievable using affine and quadratic approximation manifolds, while maintaining accuracy. In contrast to previous approaches that exploited one form or another of an ANN, the training of the ANN part of our PROM-ANN does not involve data whose dimension scales with that of the high-dimensional model; and the resulting PROM-ANN can be efficiently hyperreduced using any well-established hyperreduction method. Hence, unlike many other ANN-based model order reduction approaches, the PROM-ANN concept we propose in this paper should be practical for large-scale and industry-relevant computational problems. We demonstrate the computational tractability of its offline stage and the superior wall-clock time performance of its online stage for a large-scale, parametric, two-dimensional model problem that is representative of shock-dominated unsteady flow problems. © 2023 Elsevier Inc. All rights reserved.
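The abstract does not spell out the approximation form, but, consistent with the quadratic-manifold precursor [1], the PROM-ANN idea can be pictured as augmenting an affine POD approximation u ≈ u_ref + V q with an ANN-driven correction V̄ N(q), where V holds the retained POD modes, V̄ holds higher-order modes, and N maps the low-dimensional reduced coordinates q to the coefficients of those higher-order modes, so the ANN's training data never scale with the full-model dimension. The Python sketch below is purely illustrative of that structure under this assumption; names such as CoeffNet and prom_ann_reconstruct are hypothetical placeholders, not the authors' implementation.

```python
# Minimal, illustrative sketch of a PROM-ANN-style approximation manifold.
# CoeffNet and prom_ann_reconstruct are hypothetical names, not the paper's code.
import torch
import torch.nn as nn

class CoeffNet(nn.Module):
    """Small MLP mapping primary reduced coordinates q in R^n to the
    coefficients of n_bar secondary (higher-order) POD modes."""
    def __init__(self, n: int, n_bar: int, width: int = 32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n, width), nn.Tanh(),
            nn.Linear(width, width), nn.Tanh(),
            nn.Linear(width, n_bar),
        )

    def forward(self, q: torch.Tensor) -> torch.Tensor:
        return self.net(q)

def prom_ann_reconstruct(u_ref, V, V_bar, q, coeff_net):
    """Nonlinear approximation u ≈ u_ref + V q + V_bar N(q).
    Only q (dimension n) enters the ANN, so the ANN's training inputs and
    outputs are reduced coordinates, not full-order snapshots."""
    return u_ref + V @ q + V_bar @ coeff_net(q)

if __name__ == "__main__":
    N_hdm, n, n_bar = 10_000, 10, 40       # toy full and reduced dimensions
    u_ref = torch.zeros(N_hdm)              # reference state
    V = torch.randn(N_hdm, n)               # stand-in for leading POD modes
    V_bar = torch.randn(N_hdm, n_bar)       # stand-in for secondary POD modes
    q = torch.randn(n)                      # primary reduced coordinates
    net = CoeffNet(n, n_bar)
    u_approx = prom_ann_reconstruct(u_ref, V, V_bar, q, net)
    print(u_approx.shape)                   # torch.Size([10000])
```

Because the reconstruction is affine in the (fixed) bases V and V̄, a PROM built on such a manifold remains compatible with standard hyperreduction, which is the practicality argument the abstract makes.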
Pages: 20
Related Papers (50 in total)
  • [21] On the stability of projection-based model order reduction for convection-dominated laminar and turbulent flows
    Grimberg, Sebastian
    Farhat, Charbel
    Youkilis, Noah
    JOURNAL OF COMPUTATIONAL PHYSICS, 2020, 419
  • [22] Oblique Projection-Based Modal Matching Algorithm for LPV Model Order Reduction of Aeroservoelastic Systems
    Liu, Yishu
    Gao, Wei
    Li, Qifu
    Lu, Bei
    AEROSPACE, 2023, 10 (05)
  • [23] PROJECTION-BASED MODEL ORDER REDUCTION METHODS FOR THE ESTIMATION OF VECTOR-VALUED VARIABLES OF INTEREST
    Zahm, Olivier
    Billaud-Friess, Marie
    Nouy, Anthony
    SIAM JOURNAL ON SCIENTIFIC COMPUTING, 2017, 39 (04): A1647 - A1674
  • [24] Projection-Based Model Reduction Using Asymptotic Basis Functions
    Cassel, Kevin W.
    COMPUTATIONAL SCIENCE - ICCS 2019, PT IV, 2019, 11539 : 465 - 478
  • [26] Quantum element method for quantum eigenvalue problems derived from projection-based model order reduction
    Cheng, Ming-C.
    AIP ADVANCES, 2020, 10 (11)
  • [27] Efficient Simulation of Nonlinear Transmission Lines using Empirical Interpolation and Projection-Based Model Order Reduction
    Nouri, Behzad
    Nakhla, Michel
    2018 IEEE/MTT-S INTERNATIONAL MICROWAVE SYMPOSIUM - IMS, 2018, : 87 - 89
  • [28] Projection-based and neural-net reduced order model for nonlinear Navier-Stokes equations
    My Ha Dao
    Hoang Huy Nguyen
    Chin Chun Ooi
    Quang Tuyen Le
    APPLIED MATHEMATICAL MODELLING, 2021, 89 : 1294 - 1315
  • [29] Projection-based model reduction: Formulations for physics-based machine learning
    Swischuk, Renee
    Mainini, Laura
    Peherstorfer, Benjamin
    Willcox, Karen
    COMPUTERS & FLUIDS, 2019, 179 : 706 - 719
  • [30] Structured Projection-Based Model Reduction With Application to Stochastic Biochemical Networks
    Sootla, Aivar
    Anderson, James
    IEEE TRANSACTIONS ON AUTOMATIC CONTROL, 2017, 62 (11) : 5554 - 5566