Quantifying uncertainty for deep learning based forecasting and flow-reconstruction using neural architecture search ensembles

Cited by: 1
Authors
Maulik, Romit [1 ,2 ]
Egele, Romain [3 ]
Raghavan, Krishnan [4 ]
Balaprakash, Prasanna [5 ]
Affiliations
[1] Penn State Univ, Informat Sci & Technol, University Pk, PA 16802 USA
[2] Penn State Univ, Inst Computat & Data Sci, University Pk, PA 16802 USA
[3] Univ Paris Saclay, Gif Sur Yvette, France
[4] Argonne Natl Lab, Lemont, IL 60439 USA
[5] Oak Ridge Natl Lab, 1 Bethel Valley Rd, Oak Ridge, TN 37831 USA
Keywords
Deep ensembles; Scientific machine learning; Neural architecture and hyperparameter search; PETROV-GALERKIN PROJECTION; MODEL-REDUCTION; MONTE-CARLO; EQUATIONS; QUANTIFICATION; DECOMPOSITION; SPACE
DOI
10.1016/j.physd.2023.133852
Chinese Library Classification: O29 [Applied Mathematics]
Discipline code: 070104
Abstract
Classical problems in computational physics, such as data-driven forecasting and signal reconstruction from sparse sensors, have recently seen an explosion of deep neural network (DNN) based algorithmic approaches. However, most DNN models do not provide uncertainty estimates, which are crucial for establishing the trustworthiness of these techniques in downstream decision-making tasks and scenarios. In recent years, ensemble-based methods have achieved significant success in uncertainty quantification for DNNs on a number of benchmark problems; however, their performance on real-world applications remains under-explored. In this work, we present an automated approach to DNN discovery and demonstrate how it may also be utilized for ensemble-based uncertainty quantification. Specifically, we propose a scalable neural architecture and hyperparameter search for discovering an ensemble of DNN models for complex dynamical systems. We highlight how the proposed method not only discovers high-performing neural network ensembles for our tasks but also quantifies uncertainty seamlessly. This is achieved by using genetic algorithms and Bayesian optimization to sample the search space of neural network architectures and hyperparameters. Subsequently, a model selection approach identifies candidate models for constructing the ensemble set. Finally, a variance decomposition approach is used to estimate the uncertainty of the ensemble's predictions. We demonstrate the feasibility of this framework for two tasks: forecasting from historical data and flow reconstruction from sparse sensors, both for sea-surface temperature. The ensemble shows superior performance compared with individual high-performing models and other benchmarks. © 2023 Elsevier B.V. All rights reserved.
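The core ensemble-based uncertainty estimate described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the ensemble prediction is the mean over the member outputs, and the variance across members serves as an epistemic uncertainty estimate. The simple linear "models" here are hypothetical stand-ins for the DNNs discovered by the architecture and hyperparameter search.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_member(perturbation):
    # Hypothetical stand-in for one discovered model: y = (2 + eps) * x,
    # where eps mimics variability between independently trained networks.
    w = 2.0 + perturbation
    return lambda x: w * x

# An "ensemble" of K = 10 slightly different members, analogous to the
# candidate models retained after the model selection step.
ensemble = [make_member(rng.normal(scale=0.05)) for _ in range(10)]

def predict_with_uncertainty(ensemble, x):
    preds = np.stack([m(x) for m in ensemble])  # shape (K, n_points)
    mean = preds.mean(axis=0)                   # ensemble prediction
    var = preds.var(axis=0)                     # variance across members
    return mean, var

x = np.linspace(0.0, 1.0, 5)
mean, var = predict_with_uncertainty(ensemble, x)
```

In practice the members would be trained networks of differing architectures, and a fuller variance decomposition would also separate aleatoric (data) noise from this epistemic term.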
Pages: 14