Enhancing long-tailed classification via multi-strategy weighted experts with hybrid distillation

Times Cited: 0
Authors
Zeng, Wu [1 ]
Xiao, Zhengying [1 ]
Affiliations
[1] Putian Univ, Engn Training Ctr, Putian 351100, Peoples R China
Keywords
Long-tailed classification; Imbalanced learning; Image classification; Data augmentation; Multi-expert; Hybrid distillation
DOI
10.1007/s00530-024-01635-y
CLC Number
TP [Automation Technology, Computer Technology]
Subject Classification Code
0812
Abstract
In the real world, most datasets exhibit a long-tailed distribution. Although multi-expert approaches have achieved commendable results, there is still room for improvement. To improve model performance on long-tailed datasets, we propose PEHD (performance-weighted experts with hybrid distillation), a strategy that integrates multiple data augmentation techniques with a multi-expert system and hybrid distillation. First, we apply different augmentation strategies (strong and weak image augmentation) to the original images in order to exploit the limited samples more fully. Second, we introduce multi-strategy weighted experts to encourage each expert to focus on categories of different frequencies. Third, to reduce model variance and improve performance, we design a learning framework that combines self-distillation and mutual distillation, transferring knowledge between teacher and student models both within and across categories. Finally, based on the data volume of each category, we apply different threshold filtering strategies, which further reduce the negative impact of extremely low-quality small-batch samples on model training. Experimental results show that PEHD reaches a state-of-the-art level on multiple public datasets; in particular, on CIFAR-100-LT with an imbalance ratio of 100, classification accuracy reaches 54.11%.
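To make the described pipeline concrete, below is a minimal sketch (in PyTorch) of how a hybrid distillation objective of this kind could be combined with class-frequency-aware threshold filtering. The function names (`kl_per_sample`, `confidence_mask`, `hybrid_distillation_loss`), the temperature, the linear threshold schedule, and the loss weights are illustrative assumptions for exposition, not the paper's exact formulation.

```python
import torch
import torch.nn.functional as F

def kl_per_sample(student_logits, teacher_logits, temperature=2.0):
    """Temperature-scaled KL(teacher || student), one value per sample."""
    p_t = F.softmax(teacher_logits / temperature, dim=1).detach()
    log_p_s = F.log_softmax(student_logits / temperature, dim=1)
    return (p_t * (p_t.clamp_min(1e-8).log() - log_p_s)).sum(dim=1) * temperature ** 2

def confidence_mask(teacher_logits, labels, class_counts, t_head=0.9, t_tail=0.6):
    """Keep a sample only if the teacher's confidence clears a per-class
    threshold; rarer classes get a looser threshold (an assumed linear rule,
    standing in for the paper's category-dependent filtering)."""
    conf = F.softmax(teacher_logits, dim=1).max(dim=1).values
    freq = class_counts[labels].float() / class_counts.max()
    return (conf >= t_tail + (t_head - t_tail) * freq).float()

def hybrid_distillation_loss(logits_weak, logits_strong, labels,
                             class_counts, alpha=0.5, beta=0.5):
    """logits_weak / logits_strong: lists with one [B, C] tensor per expert,
    computed on weakly / strongly augmented views of the same batch."""
    # Supervised term: average cross-entropy over experts (weak views).
    ce = sum(F.cross_entropy(lw, labels) for lw in logits_weak) / len(logits_weak)
    self_terms, mutual_terms = [], []
    for i, (lw, ls) in enumerate(zip(logits_weak, logits_strong)):
        # Self-distillation: the weak view of expert i teaches its strong view,
        # restricted to samples that pass the confidence filter.
        mask = confidence_mask(lw, labels, class_counts)
        kl = kl_per_sample(ls, lw)
        self_terms.append((kl * mask).sum() / mask.sum().clamp_min(1.0))
        # Mutual distillation: every other expert j teaches expert i.
        for j, lw_j in enumerate(logits_weak):
            if j != i:
                mutual_terms.append(kl_per_sample(lw, lw_j).mean())
    self_loss = torch.stack(self_terms).mean()
    mutual_loss = (torch.stack(mutual_terms).mean()
                   if mutual_terms else torch.zeros_like(self_loss))
    return ce + alpha * self_loss + beta * mutual_loss

# Toy usage: three experts, ten classes, a batch of 16 random samples.
counts = torch.tensor([500, 300, 200, 100, 60, 40, 25, 15, 10, 5])
labels = torch.randint(0, 10, (16,))
weak = [torch.randn(16, 10, requires_grad=True) for _ in range(3)]
strong = [torch.randn(16, 10, requires_grad=True) for _ in range(3)]
loss = hybrid_distillation_loss(weak, strong, labels, counts)
loss.backward()
```

In a real system the per-expert logits would come from differently weighted expert heads rather than random tensors; the point of the sketch is only the interplay of the three loss terms and the frequency-dependent filter.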
Pages: 13