Feature Selection Using Maximum Feature Tree Embedded with Mutual Information and Coefficient of Variation for Bird Sound Classification

Cited by: 4
Authors
Xu, Haifeng [1]
Zhang, Yan [1]
Liu, Jiang [1]
Lv, Danjv [1]
Affiliation
[1] Southwest Forestry Univ, Coll Big Data & Intelligent Engn, Kunming 650224, Yunnan, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Birds; Classification (of information); Feature selection; Forestry
DOI
10.1155/2021/8872248
Chinese Library Classification
T [Industrial Technology]
Subject Classification Code
08
Abstract
The classification of bird sounds is important in ecological monitoring. Although extracting features from multiple perspectives helps to describe the target information more fully, the resulting feature set is very high-dimensional and suffers from the curse of dimensionality, so feature selection is necessary. This paper proposes a feature-scoring method named MICV (Mutual Information and Coefficient of Variation), which uses the coefficient of variation and mutual information to evaluate each feature's contribution to classification. A second method, named ERMFT (Eliminating Redundancy Based on Maximum Feature Tree) and based on two neighborhoods, is then explored to eliminate redundancy and optimize the feature set. The two methods are combined as the MICV-ERMFT method to select the optimal features. Experiments comparing eight different feature selection methods are conducted on two sound datasets, one of birds and one of cranes. The results show that the MICV-ERMFT method outperforms the other feature selection methods in classification accuracy and is less time-consuming.
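For illustration only, the minimal Python sketch below shows one plausible way to compute a MICV-style relevance score, assuming the per-feature score is simply the product of the feature's mutual information with the class label and its coefficient of variation. The paper's exact combination rule, and the ERMFT step that prunes redundant features using the maximum feature tree and its two neighborhoods, are not detailed in this abstract; the helper name micv_scores and the toy data are hypothetical.

```python
# Minimal sketch of a MICV-style feature score (illustrative assumption:
# score = mutual information with the label * coefficient of variation).
import numpy as np
from sklearn.feature_selection import mutual_info_classif


def micv_scores(X, y):
    """Hypothetical per-feature score: MI(feature; label) * CV(feature)."""
    mi = mutual_info_classif(X, y, random_state=0)   # relevance of each feature to the class label
    mean = np.abs(X.mean(axis=0)) + 1e-12            # guard against division by zero
    cv = X.std(axis=0) / mean                        # coefficient of variation (dispersion)
    return mi * cv


# Toy usage with random data standing in for acoustic features (e.g., MFCCs).
rng = np.random.default_rng(0)
X = rng.normal(loc=1.0, scale=0.5, size=(200, 30))
y = rng.integers(0, 2, size=200)
scores = micv_scores(X, y)
top10 = np.argsort(scores)[::-1][:10]                # indices of the ten highest-scoring features
print(top10)
```

Under these assumptions, features that are both informative about the class label and highly dispersed receive the largest scores, which is in the spirit of the MICV criterion described in the abstract.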
Pages: 14