Combining Contrastive Learning with Auto-Encoder for Out-of-Distribution Detection

Cited: 0
Authors
Luo, Dawei [1 ]
Zhou, Heng [2 ]
Bae, Joonsoo [1 ]
Yun, Bom [3 ]
Institutions
[1] Jeonbuk Natl Univ, Dept Ind & Informat Syst Engn, Jeonju 54896, South Korea
[2] Jeonbuk Natl Univ, Dept Elect & Informat Engn, Jeonju 54896, South Korea
[3] Korean Construct Equipment Technol Inst, Gunsan 10203, South Korea
Source
APPLIED SCIENCES-BASEL | 2023, Vol. 13, No. 23
Keywords
contrastive learning; auto-encoder; out-of-distribution; representation learning; unsupervised learning; FAULT-DIAGNOSIS; AUTOENCODER;
DOI
10.3390/app132312930
CLC Classification
O6 [Chemistry];
Subject Classification
0703;
Abstract
Reliability and robustness are fundamental requirements for deploying deep-learning models in real-world applications. Deployed models must be aware of their own limitations, which makes the ability to detect out-of-distribution (OOD) data and defer to human intervention a critical competency. Although several OOD detection frameworks have been introduced and have achieved remarkable results, most state-of-the-art (SOTA) models are trained with supervised learning on annotated data. Acquiring labeled data, however, can be demanding, time-consuming, or in some cases infeasible. Unsupervised learning has therefore gained substantial traction and made noteworthy advances: it enables models to train solely on unlabeled data while matching or even surpassing supervised alternatives. Among unsupervised methods, contrastive learning has proven effective for feature extraction across a variety of downstream tasks, while auto-encoders are widely used to learn representations that faithfully reconstruct the input. In this study, we introduce a novel approach that combines contrastive learning with auto-encoders for OOD detection using unlabeled data. Contrastive learning tightens the clustering of in-distribution data while separating it from OOD data, and the auto-encoder further refines the feature space. Within this framework, data are implicitly classified into in-distribution and OOD categories with a notable degree of precision. Our experiments show that this method surpasses most existing detectors that rely on unlabeled, or even labeled, data.
By incorporating an auto-encoder into an unsupervised learning framework and training it on the CIFAR-100 dataset, our model enhances the detection rate of unsupervised learning methods by an average of 5.8%. Moreover, it outperforms the supervised-based OOD detector by an average margin of 11%.
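The joint objective described in the abstract — a contrastive term that pulls augmented views of in-distribution samples together, plus an auto-encoder reconstruction term that refines the feature space — can be sketched roughly as follows. This is a minimal NumPy illustration under stated assumptions, not the authors' implementation: the NT-Xent form of the contrastive loss, the MSE reconstruction term, and the weighting factor `alpha` are all assumptions for illustration.

```python
import numpy as np

def nt_xent_loss(z1, z2, temperature=0.5):
    """Simplified NT-Xent contrastive loss over two augmented views.

    z1, z2: (n, d) embeddings of the two views; row i of z1 and row i
    of z2 form a positive pair, all other rows act as negatives.
    """
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    n = z1.shape[0]
    z = np.concatenate([z1, z2], axis=0)            # (2n, d)
    sim = (z @ z.T) / temperature                   # scaled cosine similarities
    np.fill_diagonal(sim, -np.inf)                  # exclude self-similarity
    # positive index for row i is i+n (first view) or i-n (second view)
    pos = np.concatenate([np.arange(n, 2 * n), np.arange(n)])
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -log_prob[np.arange(2 * n), pos].mean()

def reconstruction_loss(x, x_hat):
    """Mean-squared-error auto-encoder reconstruction term."""
    return np.mean((x - x_hat) ** 2)

def combined_loss(z1, z2, x, x_hat, alpha=1.0):
    """Joint objective: contrastive term plus weighted reconstruction term."""
    return nt_xent_loss(z1, z2) + alpha * reconstruction_loss(x, x_hat)
```

At test time, a detector in this spirit could score a sample by its reconstruction error and its distance to the in-distribution feature clusters, flagging high scorers as OOD; the exact scoring rule used by the paper is not specified in this record.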
Pages: 16
Related Papers
(50 total)
  • [21] Out-of-distribution Detection with Boundary Aware Learning
    Pei, Sen
    Zhang, Xin
    Fan, Bin
    Meng, Gaofeng
    COMPUTER VISION, ECCV 2022, PT XXIV, 2022, 13684 : 235 - 251
  • [22] Out-Of-Distribution Detection In Unsupervised Continual Learning
    He, Jiangpeng
    Zhu, Fengqing
    2022 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION WORKSHOPS, CVPRW 2022, 2022, : 3849 - 3854
  • [23] Negative as Positive: Enhancing Out-of-distribution Generalization for Graph Contrastive Learning
    Wang, Zixu
    Xu, Bingbing
    Yuan, Yige
    Shen, Huawei
    Cheng, Xueqi
    PROCEEDINGS OF THE 47TH INTERNATIONAL ACM SIGIR CONFERENCE ON RESEARCH AND DEVELOPMENT IN INFORMATION RETRIEVAL, SIGIR 2024, 2024, : 2548 - 2552
  • [24] Sparse Tensor Auto-Encoder for Saliency Detection
    Yang, Shuyuan
    Wang, Junxiao
    IEEE ACCESS, 2020, 8 : 2924 - 2930
  • [25] Learning a good representation with unsymmetrical auto-encoder
    Sun, Yanan
    Mao, Hua
    Guo, Quan
    Yi, Zhang
    NEURAL COMPUTING & APPLICATIONS, 2016, 27 (05): : 1361 - 1367
  • [26] Contrastive variational auto-encoder driven convergence guidance in evolutionary multitasking
    Wang, Ruilin
    Feng, Xiang
    Yu, Huiqun
    APPLIED SOFT COMPUTING, 2024, 163
  • [27] Parallel Auto-encoder for Efficient Outlier Detection
    Ma, Yunlong
    Zhang, Peng
    Cao, Yanan
    Guo, Li
    2013 IEEE INTERNATIONAL CONFERENCE ON BIG DATA, 2013,
  • [28] Online deep learning based on auto-encoder
    Zhang, Si-si
    Liu, Jian-wei
    Zuo, Xin
    Lu, Run-kun
    Lian, Si-ming
    APPLIED INTELLIGENCE, 2021, 51 (08) : 5420 - 5439
  • [29] Discriminative Representation Learning with Supervised Auto-encoder
    Fang Du
    Jiangshe Zhang
    Nannan Ji
    Junying Hu
    Chunxia Zhang
    Neural Processing Letters, 2019, 49 : 507 - 520
  • [30] Auto-Encoder based Structured Dictionary Learning
    Liu, Deyin
    Wu, Lin Yuanbo
    Liu, Liangchen
    Hu, Qichang
    Qi, Lin
    2020 IEEE 22ND INTERNATIONAL WORKSHOP ON MULTIMEDIA SIGNAL PROCESSING (MMSP), 2020,