Physical invariance in neural networks for subgrid-scale scalar flux modeling

Cited by: 29
Authors
Frezat, Hugo [1 ]
Balarac, Guillaume [1 ,2 ]
Le Sommer, Julien [1 ]
Fablet, Ronan [3 ]
Lguensat, Redouane [4 ,5 ]
Affiliations
[1] Univ Grenoble Alpes, CNRS UMR LEGI, Grenoble, France
[2] Inst Univ France IUF, Paris, France
[3] IMT Atlantique, CNRS UMR Lab STICC, Brest, France
[4] IPSL CEA, Lab Sci Climat & Environm LSCE, Gif Sur Yvette, France
[5] Sorbonne Univ, Inst Pierre Simon Laplace, LOCEAN IPSL, Paris, France
Keywords
TURBULENCE; TEMPERATURE
DOI
10.1103/PhysRevFluids.6.024607
CLC classification
O35 [Fluid mechanics]; O53 [Plasma physics]
Subject classification codes
070204; 080103; 080704
Abstract
In this paper we present a new strategy for modeling the subgrid-scale scalar flux in a three-dimensional turbulent incompressible flow using physics-informed neural networks (NNs). When trained from direct numerical simulation (DNS) data, state-of-the-art neural networks, such as convolutional neural networks, may not preserve well-known physical priors, which may in turn call into question their application to real-world case studies. To address this issue, we investigate embedding hard and soft constraints into the model based on classical transformation invariances and symmetries derived from physical laws. From simulation-based experiments, we show that the proposed transformation-invariant NN model outperforms both purely data-driven models and state-of-the-art parametric subgrid-scale models. The considered invariances act as regularizers on physical metrics during the a priori evaluation and constrain the distribution tails of the predicted subgrid-scale term to be closer to the DNS. They also increase the stability and performance of the model when used as a surrogate during a large-eddy simulation. Moreover, the transformation-invariant NN is shown to generalize to regimes that have not been seen during the training phase.
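The distinction between hard and soft invariance constraints mentioned in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: `base_model`, `symmetrized_model`, and `invariance_penalty` are hypothetical toy functions, and a discrete 90-degree rotation group on a 2D field stands in for the physical transformation invariances considered in the paper. A hard constraint builds invariance into the model by construction (here, by group averaging), while a soft constraint adds a penalty term to the training loss that measures how much the prediction changes under the transformation.

```python
import numpy as np


def rotations(field):
    """Yield the four 90-degree rotations of a 2D field (a discrete symmetry group)."""
    for k in range(4):
        yield np.rot90(field, k)


def base_model(field):
    # Toy, deliberately non-invariant surrogate "model": a weighted sum
    # whose weights break rotational symmetry.
    w = np.linspace(0.0, 1.0, field.size).reshape(field.shape)
    return float((w * field).sum())


def symmetrized_model(field):
    # Hard constraint: average the prediction over the rotation group,
    # making the output exactly invariant to 90-degree rotations.
    return float(np.mean([base_model(r) for r in rotations(field)]))


def invariance_penalty(model, field):
    # Soft constraint: a regularization term (added to the training loss)
    # penalizing disagreement between predictions on the original field
    # and on its transformed copies.
    ref = model(field)
    return float(np.mean([(model(r) - ref) ** 2 for r in rotations(field)]))


rng = np.random.default_rng(0)
x = rng.standard_normal((8, 8))
# By construction, the symmetrized model incurs (numerically) zero penalty,
# while the non-invariant base model does not.
```

In this sketch the hard-constrained model satisfies the invariance exactly regardless of training, whereas the soft penalty only encourages it; the paper's a priori and a posteriori comparisons concern analogous trade-offs for physically derived transformations.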
Pages: 22