Bare bones particle swarm optimization with adaptive chaotic jump for feature selection in classification

Cited: 0
Authors
Qiu, Chenye [1 ]
Affiliation
[1] Nanjing Univ Posts & Telecommun, Sch Internet Things, 66 Xinmofan Rd, Nanjing 210003, Jiangsu, Peoples R China
Keywords
feature selection; bare bones particle swarm; adaptive chaotic jump; global best updating mechanism; FEATURE SUBSET-SELECTION; ALGORITHM;
DOI
Not available
CLC classification
TP18 [Artificial Intelligence Theory];
Discipline codes
081104; 0812; 0835; 1405
Abstract
Feature selection (FS) is a crucial data pre-processing step in classification problems. It aims to reduce the dimensionality of the problem by eliminating irrelevant or redundant features while achieving similar or even higher classification accuracy than using all features. As a variant of particle swarm optimization (PSO), bare bones particle swarm optimization (BBPSO) is a simple but very powerful optimizer. However, like other PSO algorithms it suffers from premature convergence, especially in high-dimensional optimization problems. To improve its performance on FS problems, this paper proposes a novel BBPSO-based FS method called BBPSO-ACJ. An adaptive chaotic jump strategy is designed to help stagnated particles make a large change in their search trajectory; it enriches the search behavior of BBPSO and prevents particles from being trapped in local attractors. A new global best updating mechanism is employed to reduce the size of the obtained feature subset. The proposed BBPSO-ACJ is compared with eight evolutionary computation (EC) based wrapper methods and two filter methods on nine benchmark datasets with different numbers of dimensions and instances. The experimental results indicate that the proposed method selects the most discriminative features from the entire feature set and achieves significantly better classification performance than the comparative methods.
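The mechanism described in the abstract can be sketched as follows. The Gaussian sampling rule is the standard BBPSO position update; the stagnation counter, the logistic-map chaotic jump, and all parameter values here are illustrative assumptions, not the paper's exact BBPSO-ACJ formulation (which also includes the global-best updating rule for subset size and a binary encoding for FS, both omitted in this continuous-domain sketch):

```python
import numpy as np

def logistic_map(z):
    # Logistic map with r = 4, a common generator of chaotic sequences in (0, 1).
    return 4.0 * z * (1.0 - z)

def bbpso_chaotic_jump(obj, dim, n_particles=20, iters=100,
                       bounds=(-5.0, 5.0), stall_limit=5, seed=0):
    """Minimal BBPSO sketch with a chaotic jump for stagnated particles.

    Illustrative only: parameter choices and the jump rule are assumptions,
    not the BBPSO-ACJ algorithm of the paper.
    """
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    pos = rng.uniform(lo, hi, (n_particles, dim))
    pbest = pos.copy()
    pbest_val = np.array([obj(p) for p in pos])
    g = int(np.argmin(pbest_val))
    gbest, gbest_val = pbest[g].copy(), pbest_val[g]
    stall = np.zeros(n_particles, dtype=int)   # iterations without pbest improvement
    z = rng.uniform(0.1, 0.9)                  # state of the chaotic sequence

    for _ in range(iters):
        for i in range(n_particles):
            # Standard BBPSO update: sample around the pbest/gbest midpoint.
            mu = (pbest[i] + gbest) / 2.0
            sigma = np.abs(pbest[i] - gbest)
            pos[i] = rng.normal(mu, sigma + 1e-12)
            if stall[i] >= stall_limit:
                # Chaotic jump: relocate a stagnated particle by a
                # logistic-map-driven perturbation of its personal best.
                z = logistic_map(z)
                pos[i] = pbest[i] + (2.0 * z - 1.0) * 0.1 * (hi - lo)
                stall[i] = 0
            pos[i] = np.clip(pos[i], lo, hi)
            v = obj(pos[i])
            if v < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i].copy(), v
                stall[i] = 0
            else:
                stall[i] += 1
        g = int(np.argmin(pbest_val))
        if pbest_val[g] < gbest_val:
            gbest, gbest_val = pbest[g].copy(), pbest_val[g]
    return gbest, gbest_val
```

For example, minimizing the sphere function `lambda x: float(np.sum(x * x))` in five dimensions converges close to the origin; in the FS setting of the paper, positions would instead be binarized into feature masks and `obj` would be a wrapper classification error.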
Pages: 1-14 (14 pages)
Related papers
50 records
  • [1] Bare bones particle swarm optimization with adaptive chaotic jump for feature selection in classification
    Qiu C.
    [J]. International Journal of Computational Intelligence Systems, 2018, 11 (1) : 1 - 14
  • [2] Adaptive Bare Bones Particle Swarm Optimization for Feature Selection
    Li, Ce
    Hu, Haidong
    Gao, Hao
    Wang, Baoyun
    [J]. PROCEEDINGS OF THE 28TH CHINESE CONTROL AND DECISION CONFERENCE (2016 CCDC), 2016, : 1594 - 1599
  • [3] Feature selection algorithm based on bare bones particle swarm optimization
    Zhang, Yong
    Gong, Dunwei
    Hu, Ying
    Zhang, Wanqiu
    [J]. NEUROCOMPUTING, 2015, 148 : 150 - 157
  • [4] Feature selection using bare-bones particle swarm optimization with mutual information
    Song, Xian-fang
    Zhang, Yong
    Gong, Dun-wei
    Sun, Xiao-yan
    [J]. PATTERN RECOGNITION, 2021, 112
  • [5] Chaotic Maps in Binary Particle Swarm Optimization for Feature Selection
    Yang, Cheng-San
    Chuang, Li-Yeh
    Li, Jung-Chike
    Yang, Cheng-Hong
    [J]. 2008 IEEE CONFERENCE ON SOFT COMPUTING IN INDUSTRIAL APPLICATIONS SMCIA/08, 2009, : 107 - +
  • [6] Feature Selection for Classification Using Particle Swarm Optimization
    Brezocnik, Lucija
    [J]. 17TH IEEE INTERNATIONAL CONFERENCE ON SMART TECHNOLOGIES - IEEE EUROCON 2017 CONFERENCE PROCEEDINGS, 2017, : 966 - 971
  • [7] A twinning bare bones particle swarm optimization algorithm
    Guo, Jia
    Shi, Binghua
    Yan, Ke
    Di, Yi
    Tang, Jianyu
    Xiao, Haiyang
    Sato, Yuji
    [J]. PLOS ONE, 2022, 17 (05):
  • [8] A Hierarchical Bare Bones Particle Swarm Optimization Algorithm
    Guo, Jia
    Sato, Yuji
    [J]. 2017 IEEE INTERNATIONAL CONFERENCE ON SYSTEMS, MAN, AND CYBERNETICS (SMC), 2017, : 1936 - 1941
  • [9] Different implementations of bare bones particle swarm optimization
    Zhang, Zhen
    Pan, Zai-Ping
    Pan, Xiao-Hong
    [J]. Zhejiang Daxue Xuebao (Gongxue Ban)/Journal of Zhejiang University (Engineering Science), 2015, 49 (07): : 1350 - 1357
  • [10] A Study of Collapse in Bare Bones Particle Swarm Optimization
    Blackwell, Tim
    [J]. IEEE TRANSACTIONS ON EVOLUTIONARY COMPUTATION, 2012, 16 (03) : 354 - 372