Enhancing long-tailed classification via multi-strategy weighted experts with hybrid distillation

Cited: 0
Authors
Zeng, Wu [1 ]
Xiao, Zhengying [1 ]
Affiliations
[1] Putian Univ, Engn Training Ctr, Putian 351100, Peoples R China
Keywords
Long-tailed classification; Imbalanced learning; Image classification; Data augmentation; Multi-expert; Hybrid distillation;
DOI
10.1007/s00530-024-01635-y
CLC number
TP [Automation technology, computer technology];
Discipline code
0812;
Abstract
In the real world, most datasets exhibit a long-tailed distribution. Although multi-expert approaches have achieved commendable results, there is still room for improvement. To enhance model performance on long-tailed datasets, we propose PEHD (performance-weighted experts with hybrid distillation), a strategy that integrates multiple data augmentation techniques with a multi-expert system and hybrid distillation. First, we apply different data augmentation strategies (strong and weak image augmentation) to each original image so as to make fuller use of the limited samples. Second, we introduce multi-strategy weighted experts to encourage each expert to focus on categories of different frequencies. Furthermore, to reduce the variance of the model and improve its performance, we design a learning framework that combines self-distillation with mutual distillation, transferring knowledge between teacher and student models both within the same category and across different categories. Finally, we apply different threshold filtering strategies according to the amount of data in each category, which further reduces the negative impact of extremely low-quality small-batch samples on model training. Experimental results indicate that the PEHD strategy reaches an advanced level on multiple public datasets; in particular, on CIFAR-100-LT with an imbalance factor of 100, the classification accuracy reaches 54.11%.
Pages: 13
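
The abstract describes four cooperating components: weak and strong augmented views of each image, several experts weighted toward different class frequencies, combined self- and mutual distillation, and per-class confidence thresholds that filter low-quality samples. Since the paper's implementation is not reproduced here, the following PyTorch sketch shows only one plausible arrangement of those components; the function names, the temperature, and the threshold scheme are illustrative assumptions, not the authors' released code.

import torch
import torch.nn.functional as F

def distill_kl(student_logits, teacher_logits, T=2.0):
    # Per-sample KL divergence between softened teacher and student outputs.
    log_p_s = F.log_softmax(student_logits / T, dim=1)
    p_t = F.softmax(teacher_logits / T, dim=1)
    return F.kl_div(log_p_s, p_t, reduction="none").sum(dim=1) * (T * T)

def hybrid_distillation_loss(weak_logits, strong_logits, class_thresholds):
    # weak_logits / strong_logits: lists with one (batch, classes) tensor per
    # expert, computed on the weak and strong view of the same batch.
    # class_thresholds: (classes,) tensor of per-class confidence thresholds.
    with torch.no_grad():
        mean_weak = torch.stack(weak_logits).mean(dim=0)
        conf, pseudo = F.softmax(mean_weak, dim=1).max(dim=1)
        # Filter out samples whose confidence falls below the threshold of
        # their predicted class (the abstract's low-quality sample filtering).
        keep = (conf >= class_thresholds[pseudo]).float()

    n = len(weak_logits)
    total = mean_weak.new_zeros(())
    for i in range(n):
        # Self-distillation: expert i's weak view teaches its own strong view.
        total = total + (distill_kl(strong_logits[i], weak_logits[i].detach()) * keep).mean()
        # Mutual distillation: the other experts' weak views teach expert i.
        for j in range(n):
            if j != i:
                total = total + (distill_kl(strong_logits[i], weak_logits[j].detach()) * keep).mean() / (n - 1)
    return total / n

# Example: three experts, a batch of 8 images, 100 classes.
weak = [torch.randn(8, 100) for _ in range(3)]
strong = [torch.randn(8, 100) for _ in range(3)]
thresholds = torch.full((100,), 0.5)  # in practice varied with class frequency
loss = hybrid_distillation_loss(weak, strong, thresholds)

One natural way to set class_thresholds, in line with the abstract's per-category filtering, would be to loosen the threshold for tail classes so their scarce samples are rarely discarded while head classes are filtered more strictly; the exact schedule is not specified in the abstract.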