Principled Out-of-Distribution Detection via Multiple Testing

Cited by: 0
Authors
Magesh, Akshayaa [1 ]
Veeravalli, Venugopal V. [1 ]
Roy, Anirban [2 ]
Jha, Susmit [2 ]
Affiliations
[1] Univ Illinois, Dept Elect & Comp Engn, Champaign, IL 61820 USA
[2] SRI Int, Comp Sci Lab, Menlo Pk, CA 94061 USA
Funding
U.S. National Science Foundation;
Keywords
OOD characterization; Conformal p-values; Conditional False Alarm Guarantees; Benjamini-Hochberg procedure; FALSE DISCOVERY RATE;
DOI
Not available
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Subject Classification Code
0812;
Abstract
We study the problem of out-of-distribution (OOD) detection, that is, detecting whether a machine learning (ML) model's output can be trusted at inference time. While a number of tests for OOD detection have been proposed in prior work, a formal framework for studying this problem is lacking. We propose a definition of OOD that accounts for both the input distribution and the ML model, which yields insights for constructing powerful tests for OOD detection. We also propose a procedure, inspired by multiple hypothesis testing, that systematically combines any number of different statistics from the ML model using conformal p-values. We further provide strong guarantees on the probability of incorrectly classifying an in-distribution sample as OOD. In our experiments, we find that threshold-based tests proposed in prior work perform well in specific settings but not uniformly well across different OOD instances. In contrast, our proposed method, which combines multiple statistics, performs uniformly well across different datasets and neural network architectures.
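
To make the combination step described in the abstract concrete, the sketch below shows one way conformal p-values computed from held-out in-distribution calibration scores can be combined across several statistics with a Benjamini-Hochberg (Simes-style) rule to decide whether a sample is OOD. This is a minimal illustration, not the authors' implementation: the function names (conformal_pvalue, ood_flag_bh), the choice and number of statistics, and the synthetic calibration scores are assumptions made for this example.

# Minimal sketch: combining several OOD statistics via conformal p-values and a
# Benjamini-Hochberg-style rule. Illustrative only; the statistics and the
# synthetic data below are assumptions, not the paper's experimental setup.
import numpy as np

def conformal_pvalue(cal_scores, test_score):
    # Conformal p-value: rank of the test score among in-distribution
    # calibration scores (convention: larger score = more OOD-like).
    cal_scores = np.asarray(cal_scores)
    return (1.0 + np.sum(cal_scores >= test_score)) / (len(cal_scores) + 1.0)

def ood_flag_bh(pvalues, alpha=0.05):
    # Declare OOD if any sorted p-value falls below its BH threshold alpha*i/K,
    # i.e. the BH procedure rejects at least one of the K per-statistic nulls.
    p_sorted = np.sort(np.asarray(pvalues))
    K = len(p_sorted)
    return bool(np.any(p_sorted <= alpha * np.arange(1, K + 1) / K))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    K = 3  # e.g., negative max-softmax, energy, and a feature-distance score
    cal = [rng.normal(0.0, 1.0, size=1000) for _ in range(K)]  # in-distribution calibration scores
    x_in = rng.normal(0.0, 1.0, size=K)     # scores of a typical in-distribution sample
    x_shift = rng.normal(3.0, 1.0, size=K)  # scores of a sample whose statistics have shifted
    p_in = [conformal_pvalue(cal[k], x_in[k]) for k in range(K)]
    p_shift = [conformal_pvalue(cal[k], x_shift[k]) for k in range(K)]
    print("in-distribution sample flagged OOD:", ood_flag_bh(p_in))
    print("shifted sample flagged OOD:", ood_flag_bh(p_shift))

Under exchangeability of the test point with the calibration set, each conformal p-value is (super-)uniform for in-distribution inputs, which is what keeps the probability of flagging an in-distribution sample controlled near the chosen level alpha under standard assumptions on the joint behavior of the p-values; the exact guarantees and the precise combination rule used in the paper may differ from this sketch.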
Pages: 35
Related Papers
50 items in total
  • [1] Out-of-distribution Detection Learning with Unreliable Out-of-distribution Sources
    Zheng, Haotian
    Wang, Qizhou
    Fang, Zhen
    Xia, Xiaobo
    Liu, Feng
    Liu, Tongliang
    Han, Bo
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023,
  • [2] On the Learnability of Out-of-distribution Detection
    Fang, Zhen
    Li, Yixuan
    Liu, Feng
    Han, Bo
    Lu, Jie
    JOURNAL OF MACHINE LEARNING RESEARCH, 2024, 25
  • [3] Watermarking for Out-of-distribution Detection
    Wang, Qizhou
    Liu, Feng
    Zhang, Yonggang
    Zhang, Jing
    Gong, Chen
    Liu, Tongliang
    Han, Bo
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022,
  • [4] Out-of-Distribution Detection using Multiple Semantic Label Representations
    Shalev, Gabi
    Adi, Yossi
    Keshet, Joseph
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 31 (NIPS 2018), 2018, 31
  • [5] Entropic Out-of-Distribution Detection
    Macedo, David
    Ren, Tsang Ing
    Zanchettin, Cleber
    Oliveira, Adriano L. I.
    Ludermir, Teresa
    2021 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2021,
  • [6] Is Out-of-Distribution Detection Learnable?
    Fang, Zhen
    Li, Yixuan
    Lu, Jie
    Dong, Jiahua
    Han, Bo
    Liu, Feng
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022,
  • [7] Out-of-Distribution Detection via Conditional Kernel Independence Model
    Wang, Yu
    Zou, Jingjing
    Lin, Jingyang
    Ling, Qing
    Pan, Yingwei
    Yao, Ting
    Mei, Tao
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022,
  • [8] In- or Out-of-Distribution Detection via Dual Divergence Estimation
    Garg, Sahil
    Dutta, Sanghamitra
    Dalirrooyfard, Mina
    Schneider, Anderson
    Nevmyvaka, Yuriy
    UNCERTAINTY IN ARTIFICIAL INTELLIGENCE, 2023, 216 : 635 - 646
  • [9] Out-of-Distribution Detection for Automotive Perception
    Nitsch, Julia
    Itkina, Masha
    Senanayake, Ransalu
    Nieto, Juan
    Schmidt, Max
    Siegwart, Roland
    Kochenderfer, Mykel J.
    Cadena, Cesar
    2021 IEEE INTELLIGENT TRANSPORTATION SYSTEMS CONFERENCE (ITSC), 2021, : 2938 - 2943