Three Heads Are Better than One: Complementary Experts for Long-Tailed Semi-supervised Learning

Cited by: 0
Authors
Ma, Chengcheng [1 ,2 ]
Elezi, Ismail [3 ]
Deng, Jiankang [3 ]
Dong, Weiming [1 ]
Xu, Changsheng [1 ]
Affiliations
[1] Chinese Acad Sci, Inst Automat, Beijing, Peoples R China
[2] Univ Chinese Acad Sci, Sch Artificial Intelligence, Beijing, Peoples R China
[3] Huawei Noahs Ark Lab, London, England
Keywords
DOI
None available
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
We address the challenging problem of Long-Tailed Semi-Supervised Learning (LTSSL), where labeled data exhibit an imbalanced class distribution and unlabeled data follow an unknown distribution. Unlike in balanced SSL, the generated pseudo-labels are skewed towards head classes, intensifying the training bias. This phenomenon is further amplified when the class distributions of the labeled and unlabeled datasets are mismatched, since even more unlabeled data are then mislabeled as head classes. To solve this problem, we propose a novel method named ComPlementary Experts (CPE). Specifically, we train multiple experts to model different class distributions, each yielding high-quality pseudo-labels within one form of class distribution. In addition, we introduce Classwise Batch Normalization for CPE to avoid the performance degradation caused by the feature-distribution mismatch between head and non-head classes. CPE achieves state-of-the-art performance on the CIFAR-10-LT, CIFAR-100-LT, and STL-10-LT benchmarks. For instance, on CIFAR-10-LT, CPE improves test accuracy by over 2.22% compared to baselines. Code is available at https://github.com/machengcheng2016/CPE-LTSSL.
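The abstract does not spell out how each expert is specialized to a class distribution; one standard way to realize the general idea of "one expert per assumed class prior" is logit adjustment, where each head's logits are shifted by the log of a different assumed prior (long-tailed, uniform, inverted). The sketch below illustrates that generic mechanism only; the function names `logit_adjust` and `complementary_expert_logits` are hypothetical and this is not claimed to be the paper's exact method.

```python
import numpy as np

def logit_adjust(logits, class_prior, tau=1.0):
    """Shift logits by the log of an assumed class prior (logit-adjustment style).

    A larger assumed prior for a class raises that class's adjusted logit,
    biasing predictions toward the distribution this 'expert' models.
    """
    return logits + tau * np.log(class_prior)

def complementary_expert_logits(logits, priors, tau=1.0):
    """Produce one adjusted output per assumed class distribution.

    `priors` might contain, e.g., the long-tailed training prior, a uniform
    prior, and an inverted (tail-heavy) prior, so the experts complement
    each other across possible unlabeled-data distributions.
    """
    return [logit_adjust(logits, p, tau) for p in priors]

# Example: three experts over a 3-class problem with a head-heavy training prior.
long_tailed = np.array([0.7, 0.2, 0.1])   # head class dominates
uniform = np.ones(3) / 3
inverted = long_tailed[::-1]               # tail-heavy assumption
shared_logits = np.zeros(3)                # identical raw logits for clarity
experts = complementary_expert_logits(shared_logits, [long_tailed, uniform, inverted])
```

With identical raw logits, the long-tailed expert ranks the head class highest while the inverted expert ranks the tail class highest, showing how different assumed priors yield complementary pseudo-labelers.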
Pages: 14229-14237
Number of pages: 9