SWSEL: Sliding Window-based Selective Ensemble Learning for class-imbalance problems

Cited: 7
Authors
Dai, Qi [1 ]
Liu, Jian-wei [1 ,3 ]
Yang, Jia-Peng [2 ]
Affiliations
[1] China Univ Petr, Coll Informat Sci & Engn, Dept Automat, Beijing, Peoples R China
[2] North China Univ Sci & Technol, Coll Sci, Tangshan, Peoples R China
[3] China Univ Petr, 260 Mailbox, Beijing 102249, Peoples R China
Keywords
Sliding window; Selective ensemble learning; Ensemble learning; Distance metric; Imbalanced data; NETWORKS; ALGORITHMS; DIVERSITY; ACCURACY;
DOI
10.1016/j.engappai.2023.105959
CLC classification
TP [automation technology, computer technology]
Subject classification
0812
Abstract
For class-imbalance problems, traditional supervised learning algorithms tend to favor the majority instances (also called negative instances), making it difficult for them to accurately identify the minority instances (also called positive instances). Ensemble learning is a common remedy: it builds a multiple-classifier system on the training dataset to improve recognition accuracy on the minority instances. The sliding window is a widely used technique for processing data streams, yet few researchers have used sliding windows to select majority instances when constructing ensemble learning models. Traditional ensemble methods model some or all of the majority instances through oversampling or undersampling, but in doing so they inherit the drawbacks of those preprocessing methods. In this paper, we therefore use similarity mapping to construct a pseudo-sequence of majority instances and, following the sliding-window idea, make full use of all existing majority instances: we propose a novel sliding window-based selective ensemble learning method (SWSEL) for the class-imbalance problem. The method adopts the distance-alignment idea from multi-view alignment to align the center of the minority instances with the majority instances, and slides a window over the pseudo-sequence of majority instances to select them. In addition, to prevent long running times caused by too many classifiers, we use a distance metric to select a fixed number of base classifiers for the final ensemble model. Extensive experiments on various real-world datasets show that, with SVM, MLP and RF as base classifiers, SWSEL achieves statistically significant improvements over state-of-the-art methods on two evaluation metrics, AUC and G-mean.
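The pipeline the abstract describes — mapping majority instances onto a pseudo-sequence by similarity to the minority center, sliding a window over that sequence to train base classifiers on balanced subsets, and keeping only a few of them — can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: the nearest-centroid base learner, the window and stride sizes, and the keep-the-closest-windows selection rule are all simplifying assumptions (the paper uses SVM, MLP and RF as base classifiers and a distance metric over classifiers for selection).

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic imbalanced data: 200 majority (label 0) vs. 20 minority (label 1).
X_maj = rng.normal(0.0, 1.0, size=(200, 2))
X_min = rng.normal(2.0, 1.0, size=(20, 2))

class CentroidClassifier:
    """Toy base learner: predict the class whose centroid is nearer."""
    def fit(self, X, y):
        self.c0 = X[y == 0].mean(axis=0)
        self.c1 = X[y == 1].mean(axis=0)
        return self
    def predict(self, X):
        d0 = np.linalg.norm(X - self.c0, axis=1)
        d1 = np.linalg.norm(X - self.c1, axis=1)
        return (d1 < d0).astype(int)

# Step 1: similarity mapping -- order the majority instances by distance to
# the minority-class center, yielding a pseudo-sequence of majority instances.
center = X_min.mean(axis=0)
seq = X_maj[np.argsort(np.linalg.norm(X_maj - center, axis=1))]

# Step 2: slide a window of size len(X_min) over the sequence; each window
# plus all minority instances forms one balanced training set, so every
# majority instance is used by exactly one base classifier.
win = stride = len(X_min)
models = []
for start in range(0, len(seq) - win + 1, stride):
    X = np.vstack([seq[start:start + win], X_min])
    y = np.r_[np.zeros(win), np.ones(len(X_min))]
    models.append(CentroidClassifier().fit(X, y))

# Step 3: selective ensemble -- keep only the k classifiers trained on the
# windows nearest the minority center, then combine them by majority vote.
k = 3
selected = models[:k]  # the pseudo-sequence is already ordered by distance

def predict(X):
    votes = np.mean([m.predict(X) for m in selected], axis=0)
    return (votes >= 0.5).astype(int)
```

Keeping only the windows nearest the minority center focuses the selected ensemble on the borderline region between the classes while still letting every majority instance contribute to some base model; varying `k`, the stride, or the base learner changes the accuracy/runtime trade-off the abstract mentions.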
Pages: 20