Latent Class-Conditional Noise Model

Cited by: 5
Authors
Yao, Jiangchao [1 ,2 ]
Han, Bo [3 ]
Zhou, Zhihan [1 ,2 ]
Zhang, Ya [1 ,2 ]
Tsang, Ivor W. [4 ]
Affiliations
[1] Shanghai Jiao Tong Univ, Cooperat Medianet Innovat Ctr, Shanghai 200240, Peoples R China
[2] Shanghai AI Lab, Shanghai 200030, Peoples R China
[3] Hong Kong Baptist Univ, Dept Comp Sci, Kowloon, Hong Kong, Peoples R China
[4] A STAR Ctr Frontier AI Res, Singapore 138632, Singapore
Keywords
Noise measurement; Training; Optimization; Deep learning; Bayes methods; Robustness; Computational modeling; Bayesian modeling; deep learning; noisy supervision; semi-supervised learning; NETWORKS;
DOI
10.1109/TPAMI.2023.3247629
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Learning with noisy labels has become imperative in the Big Data era, as it saves the expensive human labor required for accurate annotation. Previous noise-transition-based methods achieve theoretically grounded performance under the Class-Conditional Noise model (CCN). However, these approaches build upon an ideal but impractical anchor set assumed to be available for pre-estimating the noise transition. Even though subsequent works adapt the estimation as a neural layer, the ill-posed stochastic learning of its parameters in back-propagation easily falls into undesired local minima. We solve this problem by introducing a Latent Class-Conditional Noise model (LCCN) that parameterizes the noise transition under a Bayesian framework. By projecting the noise transition into the Dirichlet space, the learning is constrained to a simplex characterized by the complete dataset, instead of some ad-hoc parametric space wrapped by the neural layer. We then deduce a dynamic label regression method for LCCN, whose Gibbs sampler allows us to efficiently infer the latent true labels for training the classifier and modeling the noise. Our approach safeguards the stable update of the noise transition, avoiding the previous practice of arbitrarily tuning it from a mini-batch of samples. We further generalize LCCN to counterparts compatible with open-set noisy labels, semi-supervised learning, and cross-model training. A range of experiments demonstrates the advantages of LCCN and its variants over current state-of-the-art methods. The code is publicly available.
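The abstract's core idea can be illustrated with a toy sketch: a Dirichlet-parameterized noise transition matrix T (where T[z, y] = p(noisy label y | latent true label z)) is updated from Gibbs-sampled latent true labels rather than tuned directly by back-propagation. This is a minimal, hypothetical simplification of the paper's dynamic label regression, not the authors' implementation; the classifier probabilities are hard-coded stand-ins for a trained network's outputs.

```python
import numpy as np

rng = np.random.default_rng(0)
K = 3  # number of classes (toy setting)

# Dirichlet prior over each row of the noise transition matrix T,
# where T[z, y] = p(noisy label y | latent true label z).
alpha = np.ones((K, K))

def sample_latent_label(classifier_probs, noisy_label, T):
    """Gibbs step: draw the latent true label z from its posterior,
    p(z | x, noisy y) proportional to p(z | x) * T[z, noisy y]."""
    post = classifier_probs * T[:, noisy_label]
    post /= post.sum()
    return rng.choice(K, p=post)

# Stand-ins for a classifier's softmax outputs and the observed noisy labels.
classifier_probs = np.array([[0.7, 0.2, 0.1],
                             [0.1, 0.8, 0.1],
                             [0.2, 0.2, 0.6]])
noisy_labels = np.array([0, 1, 1])

counts = alpha.copy()
for _ in range(100):
    # Current transition estimate: posterior mean of each Dirichlet row,
    # i.e. a point on the probability simplex rather than a free neural layer.
    T = counts / counts.sum(axis=1, keepdims=True)
    new_counts = alpha.copy()
    for probs, y in zip(classifier_probs, noisy_labels):
        z = sample_latent_label(probs, y, T)
        new_counts[z, y] += 1  # accumulate (true, noisy) co-occurrence
    counts = new_counts

T = counts / counts.sum(axis=1, keepdims=True)
```

Because T is rebuilt from accumulated counts over the whole pass rather than from per-mini-batch gradients, its rows stay valid probability distributions at every step, which mirrors the stability argument made in the abstract.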
Pages: 9964-9980
Page count: 17