Transfer and share: semi-supervised learning from long-tailed data

Citations: 6
Authors
Wei, Tong [1 ]
Liu, Qian-Yu [2 ]
Shi, Jiang-Xin [2 ]
Tu, Wei-Wei [3 ]
Guo, Lan-Zhe [2 ]
Affiliations
[1] Southeast Univ, Sch Comp Sci & Engn, Nanjing 210096, Peoples R China
[2] Nanjing Univ, Natl Key Lab Novel Software Technol, Nanjing 210023, Peoples R China
[3] 4Paradigm Inc, Beijing 100000, Peoples R China
Keywords
Long-tailed learning; Semi-supervised learning; Pseudo-label distribution; Logit transformation
DOI
10.1007/s10994-022-06247-z
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Long-Tailed Semi-Supervised Learning (LTSSL) aims to learn from class-imbalanced data in which only a few samples are annotated. Existing solutions typically either incur substantial cost by solving complex optimization problems or rely on class-balanced undersampling, which can cause information loss. In this paper, we present TRAS (TRAnsfer And Share) to effectively exploit long-tailed semi-supervised data. TRAS transforms the imbalanced pseudo-label distribution of a conventional SSL model through a carefully designed function to strengthen the supervisory signal for minority classes. It then transfers this distribution to a target model so that minority classes receive substantial attention. Interestingly, TRAS shows that a more balanced pseudo-label distribution can substantially benefit minority-class training, rather than seeking to generate accurate pseudo-labels as in previous work. To simplify the approach, TRAS merges the training of the SSL model and the target model into a single procedure by sharing the feature extractor, where both classifiers help improve representation learning. Extensive experiments show that TRAS delivers much higher accuracy than state-of-the-art methods, both over the entire set of classes and on minority classes.
Pages: 1725-1742
Number of pages: 18
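The abstract outlines three mechanisms: re-balancing the SSL model's pseudo-label distribution with a logit transformation, transferring that distribution to a target model, and sharing one feature extractor between the two classifiers. The sketch below illustrates these ideas in PyTorch under loose assumptions; the names (SharedBackboneTwoHeads, transform_pseudo_distribution, transfer_loss), the toy encoder, and the generic logit-adjustment transform are hypothetical stand-ins, not the authors' implementation or the exact TRAS transformation.

```python
# Minimal sketch of the ideas described in the abstract, not the TRAS code.
# Assumptions: a shared backbone with two linear heads (one standard SSL
# classifier, one balanced target classifier) and class counts estimated
# from the labeled set.

import torch
import torch.nn as nn
import torch.nn.functional as F


class SharedBackboneTwoHeads(nn.Module):
    """A shared feature extractor with two classification heads."""

    def __init__(self, feat_dim: int = 128, num_classes: int = 10):
        super().__init__()
        # Stand-in encoder; in practice this would be a CNN such as a WideResNet.
        self.encoder = nn.Sequential(nn.Linear(32, feat_dim), nn.ReLU())
        self.ssl_head = nn.Linear(feat_dim, num_classes)     # standard SSL classifier
        self.target_head = nn.Linear(feat_dim, num_classes)  # balanced target classifier

    def forward(self, x):
        z = self.encoder(x)
        return self.ssl_head(z), self.target_head(z)


def transform_pseudo_distribution(ssl_logits, class_counts, tau: float = 2.0):
    """Re-balance the SSL model's pseudo-label distribution by subtracting a
    scaled log-prior (a generic logit-adjustment transform), strengthening
    the supervisory signal for minority classes."""
    log_prior = torch.log(class_counts.float() / class_counts.sum())
    adjusted = ssl_logits - tau * log_prior  # down-weight head classes
    return F.softmax(adjusted, dim=-1)


def transfer_loss(target_logits, transformed_dist):
    """Transfer the re-balanced distribution to the target head via
    soft-label cross-entropy (KL divergence up to a constant)."""
    log_prob = F.log_softmax(target_logits, dim=-1)
    return -(transformed_dist * log_prob).sum(dim=-1).mean()


if __name__ == "__main__":
    torch.manual_seed(0)
    num_classes = 10
    # Hypothetical long-tailed class counts (head -> tail).
    class_counts = torch.tensor([500, 300, 200, 120, 80, 50, 30, 20, 10, 5])

    model = SharedBackboneTwoHeads(num_classes=num_classes)
    x_unlabeled = torch.randn(16, 32)  # toy unlabeled batch

    ssl_logits, target_logits = model(x_unlabeled)
    with torch.no_grad():
        balanced_dist = transform_pseudo_distribution(ssl_logits, class_counts)

    loss = transfer_loss(target_logits, balanced_dist)
    loss.backward()  # gradients flow into the shared encoder
    print(f"transfer loss: {loss.item():.4f}")
```

The design point illustrated is that the transfer loss back-propagates through the shared encoder via the target head, so distilling the re-balanced distribution also shapes the common representation used by both classifiers.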