Three Heads Are Better than One: Complementary Experts for Long-Tailed Semi-supervised Learning

Cited by: 0
Authors
Ma, Chengcheng [1 ,2 ]
Elezi, Ismail [3 ]
Deng, Jiankang [3 ]
Dong, Weiming [1 ]
Xu, Changsheng [1 ]
Affiliations
[1] Chinese Acad Sci, Inst Automat, Beijing, Peoples R China
[2] Univ Chinese Acad Sci, Sch Artificial Intelligence, Beijing, Peoples R China
[3] Huawei Noahs Ark Lab, London, England
Keywords
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104; 0812; 0835; 1405
Abstract
We address the challenging problem of Long-Tailed Semi-Supervised Learning (LTSSL), where labeled data exhibit an imbalanced class distribution and unlabeled data follow an unknown distribution. Unlike in balanced SSL, the generated pseudo-labels are skewed towards head classes, intensifying the training bias. This phenomenon is further amplified when the class distributions of the labeled and unlabeled sets are mismatched, since even more unlabeled data are then mislabeled as head classes. To solve this problem, we propose a novel method named ComPlementary Experts (CPE). Specifically, we train multiple experts to model various class distributions, each yielding high-quality pseudo-labels within one form of class distribution. In addition, we introduce Classwise Batch Normalization for CPE to avoid the performance degradation caused by the feature distribution mismatch between head and non-head classes. CPE achieves state-of-the-art performance on the CIFAR-10-LT, CIFAR-100-LT, and STL-10-LT benchmarks. For instance, on CIFAR-10-LT, CPE improves test accuracy by over 2.22% compared to baselines. Code is available at https://github.com/machengcheng2016/CPE-LTSSL.
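The abstract names two components: a set of complementary experts, each modeling one assumed class distribution, and Classwise Batch Normalization. The sketch below illustrates one plausible reading in PyTorch, assuming per-expert logit adjustment (a scaled log-prior added to the logits) is the mechanism by which each expert models a different distribution; the class names, the tau values, and this mechanism are illustrative assumptions, not the authors' released implementation (see the repository linked above).

    import torch
    import torch.nn as nn

    class ThreeExpertHead(nn.Module):
        """Three classifier heads over a shared backbone. Each expert's
        logits are shifted by tau * log(class prior), so the experts
        specialize in roughly uniform (tau=0), long-tailed, and
        inverse-long-tailed label distributions."""

        def __init__(self, feat_dim, num_classes, class_counts,
                     taus=(0.0, 1.0, 2.0)):
            super().__init__()
            self.heads = nn.ModuleList(
                nn.Linear(feat_dim, num_classes) for _ in taus
            )
            prior = torch.as_tensor(class_counts, dtype=torch.float)
            prior = prior / prior.sum()
            # One log-prior offset per expert, added to logits in training.
            self.register_buffer(
                "offsets", torch.stack([t * prior.log() for t in taus])
            )

        def forward(self, feats):
            # Returns one adjusted logit tensor per expert.
            return [head(feats) + off
                    for head, off in zip(self.heads, self.offsets)]

    class ClasswiseBatchNorm2d(nn.Module):
        """Separate BatchNorm statistics for head-class and non-head-class
        samples, so the two groups' feature distributions are not mixed."""

        def __init__(self, num_features):
            super().__init__()
            self.bn_head = nn.BatchNorm2d(num_features)
            self.bn_tail = nn.BatchNorm2d(num_features)

        def forward(self, x, is_head):
            # is_head: bool tensor of shape (batch,) marking head samples.
            out = torch.empty_like(x)
            if is_head.any():
                out[is_head] = self.bn_head(x[is_head])
            if (~is_head).any():
                out[~is_head] = self.bn_tail(x[~is_head])
            return out

During training, each expert's adjusted logits would receive its own supervised and pseudo-labeling loss; at test time, the expert (or an ensemble of experts) whose assumed distribution matches the evaluation distribution would be used.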
Pages: 14229-14237 (9 pages)
Related Papers (50 in total)
  • [31] Three heads are better than one: cooperative learning brains wire together when a consensus is reached
    Pan, Yafeng
    Cheng, Xiaojun
    Hu, Yi
    CEREBRAL CORTEX, 2023, 33 (04) : 1155 - 1169
  • [32] Life-long semi-supervised learning: Continuation of both learning and recognition
    Kamiya, Youki
    Ishii, Toshiaki
    Hasegawa, Osamu
2007 IEEE SYMPOSIUM ON COMPUTATIONAL INTELLIGENCE IN IMAGE AND SIGNAL PROCESSING, 2007, : 403+
  • [33] Semi-supervised one-pass multi-view learning
    Zhu, Changming
    Wang, Zhe
    Zhou, Rigui
    Wei, Lai
    Zhang, Xiafen
    Ding, Yi
    NEURAL COMPUTING & APPLICATIONS, 2019, 31 (11) : 8117 - 8134
  • [35] Long-tailed visual classification based on supervised contrastive learning with multi-view fusion
    Zeng, Liang
    Feng, Zheng
    Chen, Jia
    Wang, Shanshan
    KNOWLEDGE-BASED SYSTEMS, 2024, 301
  • [36] Mutual Learning of Complementary Networks via Residual Correction for Improving Semi-Supervised Classification
    Wu, Si
    Li, Jichang
    Liu, Cheng
    Yu, Zhiwen
    Wong, Hau-San
    2019 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR 2019), 2019, : 6493 - 6502
  • [37] A little labeling goes a long way: Semi-supervised learning in infancy
    LaTourrette, Alexander
    Waxman, Sandra R.
    DEVELOPMENTAL SCIENCE, 2019, 22 (01)
  • [38] Three Heads Are Better Than One: Organizational Changes in Collection Management Leadership
    Bishop, Barbara A.
    Grabowsky, Adelia B.
    Weisbrod, Liza
    WHERE DO WE GO FROM HERE?, 2015, : 462 - 467
  • [39] Building One-Shot Semi-Supervised (BOSS) Learning Up to Fully Supervised Performance
    Smith, Leslie N.
    Conovaloff, Adam
    FRONTIERS IN ARTIFICIAL INTELLIGENCE, 2022, 5
  • [40] Two heads are better than one: both complementary and synchronous strategies facilitate joint action
    Masumoto, Junya
    Inui, Nobuyuki
    JOURNAL OF NEUROPHYSIOLOGY, 2013, 109 (05) : 1307 - 1314