DUEL: Duplicate Elimination on Active Memory for Self-Supervised Class-Imbalanced Learning

Cited: 0
Authors
Choi, Won-Seok [1]
Lee, Hyundo [1]
Han, Dong-Sig [1]
Park, Junseok [1]
Koo, Heeyeon [2]
Zhang, Byoung-Tak [1,3]
Affiliations
[1] Seoul Natl Univ, Seoul, South Korea
[2] Yonsei Univ, Seoul, South Korea
[3] AI Inst Seoul Natl Univ AIIS, Seoul, South Korea
Keywords
WORKING-MEMORY;
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Recent machine learning algorithms have been developed using well-curated datasets, which often require substantial cost and resources. On the other hand, directly using raw data often leads to overfitting toward frequently occurring classes. To address class imbalance cost-efficiently, we propose an active data filtering process during self-supervised pre-training in our novel framework, Duplicate Elimination (DUEL). This framework integrates an active memory inspired by human working memory and introduces distinctiveness information, which measures the diversity of the data in the memory, to optimize both the feature extractor and the memory. The DUEL policy, which replaces the most duplicated data with new samples, aims to enhance the distinctiveness information in the memory and thereby mitigate class imbalance. We validate the effectiveness of the DUEL framework in class-imbalanced environments, demonstrating its robustness and its reliable performance on downstream tasks. We also analyze the role of the DUEL policy in the training process through various metrics and visualizations.
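The abstract describes the DUEL policy only at a high level. The Python sketch below illustrates one plausible reading of it, in which the memory entry with the highest aggregate cosine similarity to the rest of the buffer is treated as the "most duplicated" and is replaced by the incoming sample. The class name ActiveMemory, the cosine-similarity duplication score, and all parameters are illustrative assumptions, not the authors' implementation.

```python
import numpy as np


class ActiveMemory:
    """Minimal sketch of a DUEL-style active memory (illustrative only).

    The buffer stores feature embeddings. Once it is full, the entry whose
    summed cosine similarity to the rest of the memory is largest (i.e.,
    the "most duplicated" sample) is evicted and replaced by the incoming
    sample. The paper's exact distinctiveness measure may differ; cosine
    similarity is used here as an assumption.
    """

    def __init__(self, capacity: int, dim: int):
        self.capacity = capacity
        self.memory = np.empty((0, dim), dtype=np.float32)

    @staticmethod
    def _normalize(x: np.ndarray) -> np.ndarray:
        # L2-normalize so that dot products equal cosine similarities.
        return x / (np.linalg.norm(x, axis=-1, keepdims=True) + 1e-8)

    def insert(self, embedding: np.ndarray) -> None:
        z = self._normalize(embedding[None, :]).astype(np.float32)
        if len(self.memory) < self.capacity:
            # Memory not yet full: simply append the new embedding.
            self.memory = np.vstack([self.memory, z])
            return
        # Pairwise cosine similarities among the stored embeddings.
        sims = self.memory @ self.memory.T
        np.fill_diagonal(sims, 0.0)
        # The most duplicated entry has the largest total similarity
        # to the rest of the memory; replace it with the new sample.
        most_duplicated = int(np.argmax(sims.sum(axis=1)))
        self.memory[most_duplicated] = z


# Usage: feed embeddings from a (hypothetical) feature extractor.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    mem = ActiveMemory(capacity=8, dim=16)
    for _ in range(100):
        mem.insert(rng.normal(size=16).astype(np.float32))
    print(mem.memory.shape)  # (8, 16)
```

Under this reading, frequent-class samples tend to accumulate high mutual similarity in the memory and are therefore evicted first, which is one way the replacement rule could increase the memory's distinctiveness and mitigate class imbalance.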
Pages: 11579-11587
Page count: 9
Related Papers
(50 in total; entries [21]-[30] shown)
  • [21] Self-supervised class-balanced active learning with uncertainty-mastery fusion
    Wu, Yan-Xue
    Min, Fan
    Chen, Gong-Suo
    Shen, Shao-Peng
    Wen, Zuo-Cheng
    Zhou, Xiang-Bing
    KNOWLEDGE-BASED SYSTEMS, 2024, 300
  • [22] Rethinking the Value of Labels for Improving Class-Imbalanced Learning
    Yang, Yuzhe
    Xu, Zhi
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 33, NEURIPS 2020, 2020, 33
  • [23] Self-Supervised Learning by Estimating Twin Class Distribution
    Wang, Feng
    Kong, Tao
    Zhang, Rufeng
    Liu, Huaping
    Li, Hang
    IEEE TRANSACTIONS ON IMAGE PROCESSING, 2023, 32 : 2228 - 2236
  • [24] Self-supervised learning of class embeddings from video
    Wiles, Olivia
    Koepke, A. Sophia
    Zisserman, Andrew
    2019 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION WORKSHOPS (ICCVW), 2019, : 3019 - 3027
  • [25] Combining Self-supervised Learning and Active Learning for Disfluency Detection
    Wang, Shaolei
    Wang, Zhongyuan
    Che, Wanxiang
    Zhao, Sendong
    Liu, Ting
    ACM TRANSACTIONS ON ASIAN AND LOW-RESOURCE LANGUAGE INFORMATION PROCESSING, 2022, 21 (03)
  • [26] Memory Bank Clustering for Self-supervised Contrastive Learning
    Hao, Yiqing
    An, Gaoyun
    Ruan, Qiuqi
    IMAGE AND GRAPHICS TECHNOLOGIES AND APPLICATIONS, IGTA 2021, 2021, 1480 : 132 - 144
  • [27] Self-Supervised Reinforcement Learning for Active Object Detection
    Fang, Fen
    Liang, Wenyu
    Wu, Yan
    Xu, Qianli
    Lim, Joo-Hwee
    IEEE ROBOTICS AND AUTOMATION LETTERS, 2022, 7 (04): : 10224 - 10231
  • [28] Informative Nodes Mining for Class-Imbalanced Representation Learning
    Zhou, Mengting
    Gong, Zhiguo
    IEEE TRANSACTIONS ON COMPUTATIONAL SOCIAL SYSTEMS, 2024, : 1 - 11
  • [29] Learning Fairly With Class-Imbalanced Data for Interference Coordination
    Guo, Jia
    Xu, Zhaoqi
    Yang, Chenyang
    IEEE TRANSACTIONS ON VEHICULAR TECHNOLOGY, 2021, 70 (07) : 7176 - 7181
  • [30] Class-Imbalanced Deep Learning via a Class-Balanced Ensemble
    Chen, Zhi
    Duan, Jiang
    Kang, Li
    Qiu, Guoping
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2022, 33 (10) : 5626 - 5640