Uncertainty quantification metrics for deep regression

Cited by: 0
Authors
Lind, Simon Kristoffersson [1 ]
Xiong, Ziliang [2 ]
Forssen, Per-Erik [2 ]
Kruger, Volker [1 ]
Affiliations
[1] Lund Univ LTH, Lund, Sweden
[2] Linkoping Univ, Linkoping, Sweden
Funding
Swedish Research Council;
Keywords
Uncertainty; Evaluation; Metrics; Regression;
DOI
10.1016/j.patrec.2024.09.011
CLC number
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
When deploying deep neural networks on robots or other physical systems, the learned model should reliably quantify predictive uncertainty. Reliable uncertainty estimates allow downstream modules to reason about the safety of the system's actions. In this work, we address metrics for uncertainty quantification. Specifically, we focus on regression tasks and investigate the Area Under the Sparsification Error curve (AUSE), Calibration Error (CE), Spearman's Rank Correlation, and Negative Log-Likelihood (NLL). Using multiple datasets, we examine how these metrics behave under four typical types of uncertainty, assess their stability with respect to test-set size, and reveal their strengths and weaknesses. Our results indicate that Calibration Error is the most stable and interpretable metric, but AUSE and NLL also have their respective use cases. We discourage the use of Spearman's Rank Correlation for evaluating uncertainties and recommend replacing it with AUSE.
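The four metrics named in the abstract can be sketched on synthetic data. The following is an illustrative, simplified implementation, not the paper's evaluation code: Gaussian predictions (mean `mu`, standard deviation `sigma`) are scored with NLL, Spearman's rank correlation between uncertainty and absolute error, an unnormalized AUSE, and an interval-coverage notion of calibration error. The data-generation setup and all variable names are assumptions for illustration.

```python
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(0)

# Synthetic heteroscedastic setup: predicted mean mu, predicted std sigma,
# and targets y drawn from the (here perfectly calibrated) predictive Gaussian.
n = 2000
sigma = rng.uniform(0.1, 1.0, n)
mu = np.zeros(n)
y = rng.normal(mu, sigma)
err = np.abs(y - mu)

# Negative Log-Likelihood of the Gaussian predictive distribution.
nll = np.mean(0.5 * np.log(2 * np.pi * sigma**2) + (y - mu) ** 2 / (2 * sigma**2))

# Spearman's rank correlation between predicted uncertainty and absolute
# error (plain rank transform; ties are not handled specially).
def ranks(x):
    r = np.empty(len(x))
    r[np.argsort(x)] = np.arange(len(x))
    return r

rho = np.corrcoef(ranks(sigma), ranks(err))[0, 1]

# Simplified AUSE: repeatedly drop the samples ranked most uncertain (or,
# for the oracle, those with the largest true error) and track the mean
# absolute error of what remains; AUSE is the mean gap between the two
# sparsification curves (0 = uncertainty ranks errors perfectly).
def sparsification_curve(key):
    kept = err[np.argsort(key)]  # ascending key; slicing drops the largest keys
    return np.array([kept[: n - k].mean() for k in range(0, n, n // 50)])

ause = np.mean(sparsification_curve(sigma) - sparsification_curve(err))

# Calibration Error: average gap between nominal confidence levels and the
# empirical coverage of the corresponding central prediction intervals.
levels = np.linspace(0.1, 0.9, 9)
z = np.array([NormalDist().inv_cdf(0.5 + p / 2) for p in levels])
coverage = np.array([np.mean(err <= zi * sigma) for zi in z])
ce = np.mean(np.abs(coverage - levels))

print(f"NLL={nll:.3f}  Spearman={rho:.3f}  AUSE={ause:.4f}  CE={ce:.4f}")
```

Because the synthetic predictions are calibrated by construction, CE comes out near zero and the Spearman correlation is clearly positive; on a real model these values would reflect the quality of its uncertainty estimates.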
Pages: 91-97
Page count: 7