Cost-sensitive feature selection via the l2,1-norm

Cited by: 23
Authors
Zhao, Hong [1 ,2 ]
Yu, Shenglong [2 ]
Affiliations
[1] Minnan Normal Univ, Sch Comp Sci, Zhangzhou 363000, Fujian, Peoples R China
[2] Minnan Normal Univ, Fujian Key Lab Granular Comp & Applicat, Zhangzhou 363000, Fujian, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Cost-sensitive learning; Feature selection; Misclassification cost; Test cost; INFORMATION;
DOI
10.1016/j.ijar.2018.10.017
CLC number
TP18 [Artificial intelligence theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
An essential step in data mining and machine learning is selecting a useful feature subset from a high-dimensional feature space. Many existing feature selection algorithms consider only classification accuracy and ignore error types and test costs. In this paper, we use the l2,1-norm to propose a cost-sensitive embedded feature selection algorithm that minimizes the total cost rather than maximizing accuracy. The algorithm performs joint l2,1-norm minimization of a loss function that incorporates misclassification costs; this l2,1-norm-based loss is robust to outliers. We also add an orthogonality constraint to encourage the selected features to be independent of one another. The proposed algorithm thus accounts for both test costs and misclassification costs simultaneously. Finally, we derive an iterative updating algorithm for the objective function that makes cost-sensitive feature selection more efficient. This cost-sensitive formulation is more realistic than existing feature selection algorithms. Extensive experiments on publicly available datasets demonstrate that the proposed algorithm is effective: it selects a low-cost feature subset and achieves better performance than other feature selection algorithms in real-world applications. (C) 2018 Elsevier Inc. All rights reserved.
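The abstract states the objective only in words, so the following is a minimal sketch rather than the authors' algorithm. It assumes the joint objective combines an l2,1-norm loss weighted by per-sample misclassification costs with an l2,1-norm regularizer weighted by per-feature test costs, and it uses the standard iterative reweighting scheme for l2,1 objectives, in which each iteration solves a weighted ridge system. The function name l21_cost_sensitive_fs, the parameters mis_cost, test_cost and lam, and the ridge warm start are illustrative assumptions; the orthogonality constraint mentioned in the abstract is omitted.

```python
# Illustrative sketch only: a generic joint l2,1-norm feature selector with
# assumed cost weights, NOT the exact algorithm proposed in the paper.
import numpy as np

def l21_cost_sensitive_fs(X, Y, mis_cost=None, test_cost=None,
                          lam=1.0, n_iter=50, eps=1e-8):
    """Iteratively reweighted solver for the assumed objective
        min_W  sum_i c_i * ||x_i W - y_i||_2  +  lam * sum_j t_j * ||w_j||_2,
    where c_i are per-sample misclassification-cost weights and t_j are
    per-feature test-cost weights (both are illustrative assumptions)."""
    n, d = X.shape
    c = np.ones(n) if mis_cost is None else np.asarray(mis_cost, dtype=float)
    t = np.ones(d) if test_cost is None else np.asarray(test_cost, dtype=float)
    # Warm start with a ridge solution so the row norms below are nonzero.
    W = np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ Y)
    for _ in range(n_iter):
        # l2,1 loss reweighting: samples with large residuals are
        # down-weighted, which is what makes the loss robust to outliers.
        R = X @ W - Y
        u = c / (2.0 * np.maximum(np.linalg.norm(R, axis=1), eps))
        # l2,1 regularizer reweighting: rows of W with small norm are pushed
        # toward zero, yielding a row-sparse (feature-selecting) W.
        v = t / (2.0 * np.maximum(np.linalg.norm(W, axis=1), eps))
        A = X.T @ (u[:, None] * X) + lam * np.diag(v)
        W = np.linalg.solve(A, X.T @ (u[:, None] * Y))
    scores = np.linalg.norm(W, axis=1)   # per-feature importance
    return W, np.argsort(-scores)        # ranking, most important first

# Toy usage: 100 samples, 20 features, one-hot targets for 3 classes.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 20))
Y = np.eye(3)[rng.integers(0, 3, size=100)]
W, ranking = l21_cost_sensitive_fs(X, Y, test_cost=rng.uniform(1, 5, 20), lam=0.5)
print(ranking[:5])  # indices of the five top-ranked features
```

The row norms of W give a feature ranking; in a cost-sensitive setting one would then keep a prefix of that ranking whose cumulative test cost fits the budget, but that selection rule is likewise an assumption here, not a detail stated in the abstract.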
Pages: 25-37
Page count: 13
Related papers
50 in total
  • [1] Robust feature selection via l2,1-norm in finite mixture of regression
    Li, Xiangrui
    Zhu, Dongxiao
    PATTERN RECOGNITION LETTERS, 2018, 108 : 15 - 22
  • [2] Unsupervised maximum margin feature selection via L2,1-norm minimization
    Shizhun Yang
    Chenping Hou
    Feiping Nie
    Yi Wu
    Neural Computing and Applications, 2012, 21 : 1791 - 1799
  • [3] Unsupervised Discriminative Feature Selection in a Kernel Space via L2,1-Norm Minimization
    Liu, Yang
    Wang, Yizhou
    2012 21ST INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR 2012), 2012, : 1205 - 1208
  • [4] Robust discriminant feature selection via joint L2,1-norm distance minimization and maximization
    Yang, Zhangjing
    Ye, Qiaolin
    Chen, Qiao
    Ma, Xu
    Fu, Liyong
    Yang, Guowei
    Yan, He
    Liu, Fan
    KNOWLEDGE-BASED SYSTEMS, 2020, 207
  • [5] Discriminative Feature Selection via Joint Trace Ratio Criterion and l2,1-norm Regularization
    Jiang, Zhang
    Zhao, Mingbo
    Kong, Weijian
    2018 IEEE SYMPOSIUM ON PRODUCT COMPLIANCE ENGINEERING - ASIA 2018 (IEEE ISPCE-CN 2018), 2018, : 27 - 32
  • [6] Fast unsupervised feature selection with anchor graph and l2,1-norm regularization
    Hu, Haojie
    Wang, Rong
    Nie, Feiping
    Yang, Xiaojun
    Yu, Weizhong
    MULTIMEDIA TOOLS AND APPLICATIONS, 2018, 77 (17) : 22099 - 22113
  • [7] Unsupervised and Supervised Feature Selection for Incomplete Data via L2,1-Norm and Reconstruction Error Minimization
    Cai, Jun
    Fan, Linge
    Xu, Xin
    Wu, Xinrong
    APPLIED SCIENCES-BASEL, 2022, 12 (17):
  • [8] Characteristic Gene Selection via L2,1-norm Sparse Principal Component Analysis
    Lu, Yao
    Gao, Ying-Lian
    Liu, Jin-Xing
    Wen, Chang-Gang
    Wang, Ya-Xuan
    Yu, Jiguo
    2016 IEEE INTERNATIONAL CONFERENCE ON BIOINFORMATICS AND BIOMEDICINE (BIBM), 2016, : 1828 - 1833
  • [9] l2,1-norm minimization based negative label relaxation linear regression for feature selection
    Peng, Yali
    Sehdev, Paramjit
    Liu, Shigang
    Li, Jun
    Wang, Xili
    PATTERN RECOGNITION LETTERS, 2018, 116 : 170 - 178