Are random forests better suited than neural networks to augment RANS turbulence models?

Cited by: 3
Authors
Volpiani, Pedro Stefanin [1]
Affiliations
[1] ONERA, The French Aerospace Lab, 8 Rue Vertugadins, F-92190 Meudon, France
Keywords
RANS model; Machine learning; Neural network; Random forests; Turbulence; Eddy simulation
DOI
10.1016/j.ijheatfluidflow.2024.109348
Chinese Library Classification
O414.1 [Thermodynamics]
Abstract
Machine-learning (ML) techniques have bloomed in recent years, especially in fluid mechanics applications. In this paper, we trained, validated, and compared two types of ML-based models to augment Reynolds-averaged Navier-Stokes (RANS) simulations. The methodology was tested in a series of flows around bumps, characterized by different levels of flow separation and curvature. The ML-based models were trained on three configurations presenting attached flow, small separation, and moderate separation, and tested on two configurations presenting incipient and large separation. The output quantity of the machine-learning model is the turbulent viscosity, as in Volpiani et al. (2022). The new models based on artificial neural networks (NN) and random forests (RF) improved the results compared to the baseline Spalart-Allmaras model, in terms of velocity field and skin-friction profiles. We noted that the NN has better extrapolation properties than the RF, but the skin-friction distribution can present small oscillations when certain input features are used. These oscillations can be reduced if the RF model is employed. One major advantage of the RF is that raw quantities can be given as input features, avoiding normalization issues (such as division by zero) and allowing a larger number of universal inputs. Finally, we propose a mixed NN-RF model that combines the strengths of each method and, as a result, considerably improves the RANS prediction capability, even for a case with strong separation where the Boussinesq hypothesis (and therefore the eddy-viscosity assumption) lacks accuracy.
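As an illustration of the workflow summarized above, the sketch below shows how a neural-network regressor and a random-forest regressor could be trained to predict a turbulent (eddy) viscosity from local flow features, here using scikit-learn on synthetic placeholder data. The feature set, target values, network size, and the simple averaging used for the "mixed" prediction are assumptions for illustration only; they do not reproduce the paper's actual models or its NN-RF coupling strategy.

```python
# Minimal sketch (not the paper's implementation): train an NN and an RF
# regressor to map local mean-flow features to a target eddy viscosity,
# as one might do when augmenting a RANS model offline.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Placeholder training data: rows = mesh points from the training cases,
# columns = local flow features (strain rate, wall distance, etc. -- assumed).
X_train = rng.normal(size=(5000, 6))
y_train = rng.random(5000)  # target "true" turbulent viscosity (placeholder)

# Neural network: input features are typically normalized first; this is the
# step where division-by-zero issues can appear for some feature definitions.
scaler = StandardScaler().fit(X_train)
nn = MLPRegressor(hidden_layer_sizes=(30, 30), max_iter=2000, random_state=0)
nn.fit(scaler.transform(X_train), y_train)

# Random forest: tree splits are insensitive to monotone feature scaling,
# so raw (unnormalized) quantities can be used directly as inputs.
rf = RandomForestRegressor(n_estimators=200, random_state=0)
rf.fit(X_train, y_train)

# Prediction on an unseen (test) configuration.
X_test = rng.normal(size=(1000, 6))
nu_t_nn = nn.predict(scaler.transform(X_test))
nu_t_rf = rf.predict(X_test)

# A simple blended ("mixed") prediction for illustration only.
nu_t_mixed = 0.5 * (nu_t_nn + nu_t_rf)
```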
Pages: 10
Related Papers (50 in total)
  • [11] Algorithms of the Möbius function by random forests and neural networks
    Qin, Huan
    Ye, Yangbo
    JOURNAL OF BIG DATA, 2024, 11 (01)
  • [12] Comparing Explanations between Random Forests and Artificial Neural Networks
    Harris, Lee
    Grzes, Marek
    2019 IEEE INTERNATIONAL CONFERENCE ON SYSTEMS, MAN AND CYBERNETICS (SMC), 2019, : 2978 - 2985
  • [15] Ensembling neural networks: Many could be better than all
    Zhou, ZH
    Wu, JX
    Tang, W
    ARTIFICIAL INTELLIGENCE, 2002, 137 (1-2) : 239 - 263
  • [16] Two is better than one: A diploid genotype for neural networks
    Calabretta, R
    Galbiati, R
    Nolfi, S
    Parisi, D
    NEURAL PROCESSING LETTERS, 1996, 4 (03) : 149 - 155
  • [17] Are ARIMA neural network hybrids better than single models?
    Taskaya-Temizel, T
    Ahmad, K
    PROCEEDINGS OF THE INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), VOLS 1-5, 2005, : 3192 - 3197
  • [18] Study on prediction models of oxygenated components content in biomass pyrolysis oil based on neural networks and random forests
    Zou, Yuqian
    Tian, Hong
    Huang, Zhangjun
    Liu, Lei
    Xuan, Yanni
    Dai, Jingchao
    Nie, Liubao
    BIOMASS & BIOENERGY, 2025, 193
  • [19] A mixture of shallow neural networks for virtual sensing: Could perform better than deep neural networks
    Shao, Weiming
    Li, Xu
    Xing, Yupeng
    Chen, Junghui
    EXPERT SYSTEMS WITH APPLICATIONS, 2024, 256
  • [20] Prediction of large magnetic moment materials with graph neural networks and random forests
    Kaba, Sekou-Oumar
    Groleau-Pare, Benjamin
    Gauthier, Marc-Antoine
    Tremblay, A.-M. S.
    Verret, Simon
    Gauvin-Ndiaye, Chloe
    PHYSICAL REVIEW MATERIALS, 2023, 7 (04)