Latent Class-Conditional Noise Model

Cited by: 5
Authors
Yao, Jiangchao [1 ,2 ]
Han, Bo [3 ]
Zhou, Zhihan [1 ,2 ]
Zhang, Ya [1 ,2 ]
Tsang, Ivor W. [4 ]
Affiliations
[1] Shanghai Jiao Tong Univ, Cooperat Medianet Innovat Ctr, Shanghai 200240, Peoples R China
[2] Shanghai AI Lab, Shanghai 200030, Peoples R China
[3] Hong Kong Baptist Univ, Dept Comp Sci, Kowloon, Hong Kong, Peoples R China
[4] A STAR Ctr Frontier AI Res, Singapore 138632, Singapore
Keywords
Noise measurement; Training; Optimization; Deep learning; Bayes methods; Robustness; Computational modeling; Bayesian modeling; deep learning; noisy supervision; semi-supervised learning; NETWORKS;
DOI
10.1109/TPAMI.2023.3247629
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Learning with noisy labels has become imperative in the Big Data era, as it saves expensive human labor on accurate annotation. Previous noise-transition-based methods have achieved theoretically grounded performance under the Class-Conditional Noise model (CCN). However, these approaches build upon an ideal but impractical anchor set assumed to be available for pre-estimating the noise transition. Even though subsequent works adapt the estimation as a neural layer, the ill-posed stochastic learning of its parameters in back-propagation easily falls into undesired local minima. We solve this problem by introducing a Latent Class-Conditional Noise model (LCCN) that parameterizes the noise transition under a Bayesian framework. By projecting the noise transition into the Dirichlet space, the learning is constrained on a simplex characterized by the complete dataset, instead of some ad-hoc parametric space wrapped by the neural layer. We then deduce a dynamic label regression method for LCCN, whose Gibbs sampler allows us to efficiently infer the latent true labels to train the classifier and to model the noise. Our approach safeguards the stable update of the noise transition, avoiding the previous arbitrary tuning from a mini-batch of samples. We further generalize LCCN to counterparts compatible with open-set noisy labels, semi-supervised learning, and cross-model training. A range of experiments demonstrates the advantages of LCCN and its variants over current state-of-the-art methods. The code is publicly available.
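To make the abstract's Gibbs-sampling idea concrete, below is a minimal, self-contained sketch (not the authors' released code) of an LCCN-style alternating update: each row of the noise transition matrix T gets a Dirichlet prior, latent true labels are sampled from the posterior combining classifier probabilities with T, and T is then resampled from its Dirichlet posterior given the sampled labels. All names (probs, noisy, alpha, K, N) and the random placeholder inputs are illustrative assumptions; the full method interleaves these steps with SGD updates of the classifier, which this toy loop omits.

    import numpy as np

    rng = np.random.default_rng(0)
    K = 3                                     # number of classes (assumed)
    N = 1000                                  # number of training samples (assumed)

    # Placeholder inputs: classifier softmax outputs and observed noisy labels.
    probs = rng.dirichlet(np.ones(K), size=N)   # p(y = c | x_i) from the classifier
    noisy = rng.integers(0, K, size=N)          # observed noisy labels z_i

    alpha = np.ones((K, K))                     # Dirichlet prior over each row of T
    T = rng.dirichlet(np.ones(K), size=K)       # noise transition, T[c, z] = p(z | y = c)

    for step in range(50):
        # Sample latent true labels: p(y_i = c | x_i, z_i) ∝ p(y = c | x_i) * T[c, z_i].
        post = probs * T[:, noisy].T            # shape (N, K), unnormalized posterior
        post /= post.sum(axis=1, keepdims=True)
        y = np.array([rng.choice(K, p=p) for p in post])

        # Resample T from its Dirichlet posterior given the sampled (y, z) counts,
        # so the transition is estimated from the whole dataset on the simplex,
        # not back-propagated through an unconstrained neural layer.
        counts = np.zeros((K, K))
        np.add.at(counts, (y, noisy), 1.0)
        T = np.vstack([rng.dirichlet(alpha[c] + counts[c]) for c in range(K)])

    print(np.round(T, 2))                       # estimated noise transition

With random placeholder inputs T converges toward the empirical co-occurrence of sampled labels and noisy labels; with a real classifier in the loop, the sampled y would also serve as training targets, which is the "dynamic label regression" the abstract describes.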
Pages: 9964-9980
Page count: 17
Related Papers
50 items in total
  • [1] Class-Conditional Label Noise in Astroparticle Physics
    Bunse, Mirko
    Pfahler, Lukas
    MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES: APPLIED DATA SCIENCE AND DEMO TRACK, ECML PKDD 2023, PT VI, 2023, 14174 : 19 - 35
  • [2] Hypothesis Testing for Class-Conditional Label Noise
    Poyiadzi, Rafael
    Yang, Weisong
    Twomey, Niall
    Santos-Rodriguez, Raul
    MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES, ECML PKDD 2022, PT III, 2023, 13715 : 171 - 186
  • [3] Hypothesis Testing for Class-Conditional Noise Using Local Maximum Likelihood
    Yang, Weisong
    Poyiadzi, Rafael
    Twomey, Niall
    Santos-Rodriguez, Raul
    THIRTY-EIGHTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 38 NO 19, 2024, : 21744 - 21752
  • [4] Learning a metric for class-conditional KNN
    Im, Daniel Jiwoong
    Taylor, Graham W.
    2016 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2016, : 1932 - 1939
  • [5] CCGG: A Deep Autoregressive Model for Class-Conditional Graph Generation
    Ommi, Yassaman
    Yousefabadi, Matin
    Faez, Faezeh
    Sabour, Amirmojtaba
    Baghshah, Mahdieh Soleymani
    Rabiee, Hamid R.
    COMPANION PROCEEDINGS OF THE WEB CONFERENCE 2022, WWW 2022 COMPANION, 2022, : 1092 - 1098
  • [6] CCMN: A General Framework for Learning With Class-Conditional Multi-Label Noise
    Xie, Ming-Kun
    Huang, Sheng-Jun
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2023, 45 (01) : 154 - 166
  • [7] Species Distribution Modeling of Citizen Science Data as a Classification Problem with Class-Conditional Noise
    Hutchinson, Rebecca A.
    He, Liqiang
    Emerson, Sarah C.
    THIRTY-FIRST AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2017, : 4516 - 4523
  • [8] Class-Conditional Conformal Prediction with Many Classes
    Ding, Tiffany
    Angelopoulos, Anastasios N.
    Bates, Stephen
    Jordan, Michael I.
    Tibshirani, Ryan J.
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023,
  • [9] Class-conditional domain adaptation for semantic segmentation
    Wang, Yue
    Li, Yuke
    Elder, James H.
    Wu, Runmin
    Lu, Huchuan
    COMPUTATIONAL VISUAL MEDIA, 2024, 10 (05) : 1013 - 1030
  • [10] Learning Class-Conditional GANs with Active Sampling
    Xie, Ming-Kun
    Huang, Sheng-Jun
    KDD'19: PROCEEDINGS OF THE 25TH ACM SIGKDD INTERNATIONAL CONFERENCE ON KNOWLEDGE DISCOVERY AND DATA MINING, 2019, : 998 - 1006