An Ensemble Learning Approach with Gradient Resampling for Class-Imbalance Problems

Cited by: 11
Authors
Zhao, Hongke [1 ,2 ]
Zhao, Chuang [1 ,2 ]
Zhang, Xi [1 ,2 ,3 ]
Liu, Nanlin [1 ,2 ]
Zhu, Hengshu [4 ]
Liu, Qi [5 ]
Xiong, Hui [6 ]
Affiliations
[1] Tianjin Univ, Coll Management & Econ, Tianjin 300000, Peoples R China
[2] Tianjin Univ, Lab Computat & Analyt Complex Management Syst, CACMS, Tianjin 300000, Peoples R China
[3] Beijing Inst Technol, Sch Management & Econ, Beijing 100081, Peoples R China
[4] BOSS Zhipin, Career Sci Lab, Beijing 100000, Peoples R China
[5] Univ Sci & Technol China, Anhui Prov Key Lab Big Data Anal & Applict, Hefei 230000, Anhui, Peoples R China
[6] Hong Kong Univ Sci & Technol, Guangzhou 510000, Guangdong, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
class-imbalance learning; ensemble learning; under-sampling strategy; gradient distribution; SMOTE; CLASSIFICATION; ALGORITHM;
DOI
10.1287/ijoc.2023.1274
Chinese Library Classification
TP39 [Computer Applications];
Discipline Codes
081203; 0835;
Abstract
Imbalanced classification arises in many real-world applications and has been extensively studied. Most existing algorithms alleviate the imbalance either by sampling or by guiding ensemble learners with penalty terms, and combining ensemble learning with class-level sampling strategies has achieved great progress. In practice, however, certain hard examples contribute little to model learning and can even degrade performance. Viewed through the classification difficulty of individual samples, an important motivation is to design algorithms that equip different samples with progressive learning. Unfortunately, how to configure the sampling and learning strategies under ensemble principles at the sample level remains a research gap. In this paper, we propose a new view at the sample level rather than the class level of existing studies. We design an ensemble approach coupled with sample-level gradient resampling, namely balanced cascade with filters (BCWF). As a preliminary exploration, we first design a hard-example mining algorithm to explore the gradient distribution of sample classification difficulty and identify the hard examples. Specifically, BCWF uses an under-sampling strategy and a boosting manner to train T predictive classifiers and reidentify hard examples. Moreover, we design two types of filters for BCWF: the first is assembled with a hard filter (BCWF_h), whereas the second is assembled with a soft filter (BCWF_s). In each round of boosting, BCWF_h strictly removes a gradient (a set of the hardest examples) from both classes, whereas BCWF_s simultaneously removes a larger number of harder and easy examples so that the retained data are class balanced. Consequently, the T well-trained predictive classifiers can be combined with two ensemble voting strategies: average probability and majority vote.
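The cascade described in the abstract can be sketched as follows. This is an illustrative reconstruction, not the authors' released implementation: the tiny logistic weak learner, the |p − y| difficulty proxy standing in for the gradient distribution, and the drop_frac parameter of the hard filter are all placeholder assumptions.

```python
import numpy as np

def _sigmoid(z):
    return 1.0 / (1.0 + np.exp(-np.clip(z, -30, 30)))

def _fit_logreg(X, y, epochs=200, lr=0.5):
    # Tiny logistic regression trained by gradient descent
    # (stand-in for the paper's weak learner).
    Xb = np.hstack([X, np.ones((len(X), 1))])  # append bias column
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        g = Xb.T @ (_sigmoid(Xb @ w) - y) / len(y)
        w -= lr * g
    return w

def _predict_proba(w, X):
    Xb = np.hstack([X, np.ones((len(X), 1))])
    return _sigmoid(Xb @ w)

def bcwf_h_sketch(X, y, T=5, drop_frac=0.05, seed=0):
    """BCWF_h-style cascade sketch: each boosting round under-samples the
    majority class to balance, trains a weak learner, scores classification
    difficulty by |p - y| (a gradient-norm proxy), and hard-filters the
    hardest examples from both classes before the next round."""
    rng = np.random.default_rng(seed)
    models, idx = [], np.arange(len(y))
    for _ in range(T):
        # Class-balanced under-sample of the currently retained pool.
        pos, neg = idx[y[idx] == 1], idx[y[idx] == 0]
        n = min(len(pos), len(neg))
        sub = np.concatenate([rng.choice(pos, n, replace=False),
                              rng.choice(neg, n, replace=False)])
        w = _fit_logreg(X[sub], y[sub])
        models.append(w)
        # Hard filter: drop the drop_frac hardest retained examples.
        diff = np.abs(_predict_proba(w, X[idx]) - y[idx])
        keep = np.argsort(diff)[: max(4, int(len(idx) * (1 - drop_frac)))]
        idx = np.sort(idx[keep])
    return models

def majority_vote(models, X):
    # One of the two ensemble strategies named in the abstract.
    votes = np.stack([_predict_proba(w, X) >= 0.5 for w in models])
    return (votes.mean(axis=0) >= 0.5).astype(int)
```

The average-probability strategy would instead average `_predict_proba` over the T models and threshold once at the end.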
To evaluate the proposed approach, we conduct intensive experiments on 10 benchmark data sets and apply our algorithms to default user detection on a real-world peer-to-peer lending data set. The experimental results demonstrate the effectiveness and the managerial implications of our approach compared with 11 competitive algorithms.
Pages: 747-763
Page count: 18