Naive Bayes-Guided Bat Algorithm for Feature Selection

Cited by: 44
Authors
Taha, Ahmed Majid [1 ]
Mustapha, Aida [2 ]
Chen, Soong-Der [3 ]
Affiliations
[1] Univ Tenaga Nas, Coll Grad Studies, Kajang 43000, Selangor, Malaysia
[2] Univ Putra Malaysia, Fac Comp Sci & Informat Technol, Serdang 43400, Selangor, Malaysia
[3] Univ Tenaga Nas, Coll Informat Technol, Kajang 43000, Selangor, Malaysia
Keywords
OPTIMIZATION APPROACH; ATTRIBUTE REDUCTION; ROUGH; CLASSIFIER;
DOI
10.1155/2013/325973
Chinese Library Classification
O [Mathematical Sciences and Chemistry]; P [Astronomy, Earth Sciences]; Q [Biological Sciences]; N [General Natural Sciences]
Subject Classification Codes
07; 0710; 09
Abstract
With the amount of data and information estimated to double roughly every 20 months, feature selection has become increasingly important and beneficial. Further improvements in feature selection will positively affect a wide range of applications in fields such as pattern recognition, machine learning, and signal processing. This work presents a bio-inspired method, the Bat Algorithm, hybridized with a Naive Bayes classifier (BANB). The performance of the proposed feature selection algorithm was investigated on twelve benchmark datasets from different domains and compared with three other well-known feature selection algorithms. The discussion focuses on four perspectives: number of features, classification accuracy, stability, and feature generalization. The results show that BANB significantly outperforms the other algorithms in selecting a smaller number of features, removing irrelevant, redundant, or noisy features while maintaining classification accuracy. BANB also proves more stable than the other methods and is capable of producing more general feature subsets.
Pages: 9
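The abstract above describes a wrapper approach in which the Bat Algorithm searches the space of feature subsets while a Naive Bayes classifier scores each candidate subset. The sketch below illustrates that general idea only; the binary encoding via a sigmoid transfer function, the fitness weighting between accuracy and subset size, the parameter values, and the example dataset (scikit-learn's breast cancer data) are assumptions made for illustration and are not taken from the paper's BANB formulation.

```python
# Minimal sketch of Naive Bayes-guided wrapper feature selection with a
# binary bat-style search. Transfer function, fitness weighting, and
# parameters are illustrative assumptions, not the paper's exact BANB.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(0)

def fitness(mask, X, y, alpha=0.99):
    """Score a subset: NB cross-validated accuracy traded off against size."""
    if not mask.any():
        return 0.0
    acc = cross_val_score(GaussianNB(), X[:, mask], y, cv=5).mean()
    return alpha * acc + (1 - alpha) * (1 - mask.sum() / mask.size)

def bat_feature_selection(X, y, n_bats=10, n_iter=30,
                          f_min=0.0, f_max=2.0, loudness=0.9, pulse=0.5):
    n_feat = X.shape[1]
    pos = rng.random((n_bats, n_feat)) > 0.5            # binary positions
    vel = np.zeros((n_bats, n_feat))
    fit = np.array([fitness(p, X, y) for p in pos])
    best, best_fit = pos[fit.argmax()].copy(), fit.max()

    for _ in range(n_iter):
        for i in range(n_bats):
            # Frequency-driven velocity update toward the best bat so far.
            freq = f_min + (f_max - f_min) * rng.random()
            vel[i] += (pos[i].astype(float) - best.astype(float)) * freq
            # Sigmoid transfer maps velocity to a bit-flip probability.
            prob = 1.0 / (1.0 + np.exp(-vel[i]))
            cand = rng.random(n_feat) < prob
            if rng.random() > pulse:                     # local search near best
                cand = best ^ (rng.random(n_feat) < 0.1)
            cand_fit = fitness(cand, X, y)
            if cand_fit >= fit[i] and rng.random() < loudness:
                pos[i], fit[i] = cand, cand_fit
            if cand_fit > best_fit:
                best, best_fit = cand.copy(), cand_fit
    return best, best_fit

X, y = load_breast_cancer(return_X_y=True)
mask, score = bat_feature_selection(X, y)
print(f"selected {mask.sum()}/{mask.size} features, fitness={score:.3f}")
```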