Learning Noise Transition Matrix from Only Noisy Labels via Total Variation Regularization

Cited by: 0
Authors
Zhang, Yivan [1 ,2 ]
Niu, Gang [2 ]
Sugiyama, Masashi [1 ,2 ]
Affiliations
[1] Univ Tokyo, Tokyo, Japan
[2] RIKEN AIP, Tokyo, Japan
Keywords
CLASSIFICATION;
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
Many weakly supervised classification methods employ a noise transition matrix to capture the class-conditional label corruption. To estimate the transition matrix from noisy data, existing methods often need to estimate the noisy class-posterior, which could be unreliable due to the overconfidence of neural networks. In this work, we propose a theoretically grounded method that can estimate the noise transition matrix and learn a classifier simultaneously, without relying on the error-prone noisy class-posterior estimation. Concretely, inspired by the characteristics of the stochastic label corruption process, we propose total variation regularization, which encourages the predicted probabilities to be more distinguishable from each other. Under mild assumptions, the proposed method yields a consistent estimator of the transition matrix. We show the effectiveness of the proposed method through experiments on benchmark and real-world datasets.
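The core idea in the abstract is a total variation (TV) regularizer that pushes predicted class-probability vectors apart, combined with training through a noise transition matrix. Below is a minimal NumPy sketch of that idea, not the authors' implementation: the row-stochastic matrix `T` (with `T[i, j] = p(noisy label j | clean label i)`), the regularization weight `lam`, and the batch-pairwise form of the TV term are assumptions made for illustration.

```python
import numpy as np

def total_variation(p, q):
    # Total variation distance between discrete distributions:
    # TV(p, q) = 0.5 * sum_k |p_k - q_k|.
    return 0.5 * np.abs(p - q).sum(axis=-1)

def tv_regularizer(probs):
    # Mean pairwise TV distance over a batch of predicted
    # class-probability vectors, shape (batch, classes).
    # Larger values mean the predictions are more distinguishable,
    # so this term enters the loss with a NEGATIVE sign.
    n = probs.shape[0]
    total, count = 0.0, 0
    for i in range(n):
        for j in range(i + 1, n):
            total += total_variation(probs[i], probs[j])
            count += 1
    return total / max(count, 1)

def noisy_loss(clean_probs, T, noisy_labels, lam=0.1):
    # Cross-entropy on noisy labels through the transition matrix:
    # p(noisy y | x) = p(clean y | x) @ T  (T row-stochastic),
    # minus a TV regularizer with hypothetical weight `lam`.
    noisy_probs = clean_probs @ T
    ce = -np.log(noisy_probs[np.arange(len(noisy_labels)), noisy_labels])
    return ce.mean() - lam * tv_regularizer(clean_probs)
```

In the actual method, `T` is a learned parameter estimated jointly with the classifier; here it is passed in as a fixed array only to keep the sketch self-contained.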
Pages: 12
Related papers
50 records in total
  • [1] Learning with Noisy Labels via Sparse Regularization
    Zhou, Xiong
    Liu, Xianming
    Wang, Chenyang
    Zhai, Deming
    Jiang, Junjun
    Ji, Xiangyang
    2021 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2021), 2021, : 72 - 81
  • [2] MATRIX SMOOTHING: A REGULARIZATION FOR DNN WITH TRANSITION MATRIX UNDER NOISY LABELS
    Lv, Xianbin
    Wu, Dongxian
    Xia, Shu-Tao
    2020 IEEE INTERNATIONAL CONFERENCE ON MULTIMEDIA AND EXPO (ICME), 2020,
  • [3] Learning Deep Networks from Noisy Labels with Dropout Regularization
    Jindal, Ishan
    Nokleby, Matthew
    Chen, Xuewen
    2016 IEEE 16TH INTERNATIONAL CONFERENCE ON DATA MINING (ICDM), 2016, : 967 - 972
  • [4] Consistency Regularization on Clean Samples for Learning with Noisy Labels
    Nomura, Yuichiro
    Kurita, Takio
    IEICE TRANSACTIONS ON INFORMATION AND SYSTEMS, 2022, E105D (02) : 387 - 395
  • [5] Towards Federated Learning against Noisy Labels via Local Self-Regularization
    Jiang, Xuefeng
    Sun, Sheng
    Wang, Yuwei
    Liu, Min
    PROCEEDINGS OF THE 31ST ACM INTERNATIONAL CONFERENCE ON INFORMATION AND KNOWLEDGE MANAGEMENT, CIKM 2022, 2022, : 862 - 873
  • [6] Learning with Noisy Labels by Efficient Transition Matrix Estimation to Combat Label Miscorrection
    Kye, Seong Min
    Choi, Kwanghee
    Yi, Joonyoung
    Chang, Buru
    COMPUTER VISION, ECCV 2022, PT XXV, 2022, 13685 : 717 - 738
  • [7] Learning From Noisy Labels Via Dynamic Loss Thresholding
    Yang H.
    Jin Y.
    Li Z.
    Wang D.
    Geng X.
    Zhang M.
    IEEE Transactions on Knowledge and Data Engineering, 2024, 36 (11) : 1 - 14
  • [8] Learning from Noisy Labels via Discrepant Collaborative Training
    Han, Yan
    Roy, Soumava Kumar
    Petersson, Lars
    Harandi, Mehrtash
    2020 IEEE WINTER CONFERENCE ON APPLICATIONS OF COMPUTER VISION (WACV), 2020, : 3158 - 3167
  • [9] Efficient InSAR phase noise reduction via total variation regularization
    LUO XiaoMei
    WANG XiangFeng
    SUO ZhiYong
    LI ZhenFang
    Science China (Information Sciences), 2015, 58 (08) : 64 - 76
  • [10] PNP: Robust Learning from Noisy Labels by Probabilistic Noise Prediction
    Sun, Zeren
    Shen, Fumin
    Huang, Dan
    Wang, Qiong
    Shu, Xiangbo
    Yao, Yazhou
    Tang, Jinhui
    2022 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR 2022), 2022, : 5301 - 5310