A discriminative multi-class feature selection method via weighted l2,1-norm and Extended Elastic Net

Cited by: 10
Authors
Chen, Si-Bao [1 ]
Zhang, Ying [1 ]
Ding, Chris H. Q. [2 ]
Zhou, Zhi-Li [3 ]
Luo, Bin [1 ]
Affiliations
[1] Anhui Univ, Sch Comp Sci & Technol, Hefei 230601, Anhui, Peoples R China
[2] Univ Texas Arlington, Dept Comp Sci & Engn, Arlington, TX 76019 USA
[3] Nanjing Univ Informat Sci & Technol, Sch Comp & Software, Nanjing 210044, Jiangsu, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
l(2,1)-norm; Elastic Net; Sparse minimization; Multi-class; Feature selection; MOLECULAR CLASSIFICATION; GENE SELECTION; REGRESSION; CANCER; FACE; REGULARIZATION; INFORMATION; CARCINOMAS; PREDICTION; FRAMEWORK;
DOI
10.1016/j.neucom.2017.09.055
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104; 0812; 0835; 1405;
Abstract
Feature selection plays an important role in many pattern recognition and machine learning applications, where meaningful features are to be extracted from high-dimensional raw data and noisy ones are to be eliminated. l(2,1)-norm regularization based Robust Feature Selection (RFS) has attracted much attention due to its efficiency and the high performance of joint sparsity. In this paper, we propose a more general framework for robust and discriminative multi-class feature selection. Four types of weighting, based on correlation information between features and labels, are adopted to strengthen the discriminative performance of l(2,1)-norm joint sparsity. F-norm regularization, extended from the multi-class Elastic Net, is added to improve the stability of the method. An efficient algorithm and its corresponding convergence proof are provided. Experiments on several two-class and multi-class datasets verify the effectiveness of the proposed feature selection method. (C) 2017 Elsevier B.V. All rights reserved.
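To make the abstract's central idea concrete, the following is a minimal illustrative sketch of plain l(2,1)-norm regularized feature selection, solving min_W ||XW - Y||_F^2 + gamma*||W||_{2,1} by the standard iteratively reweighted least-squares scheme and ranking features by the row norms of W. This is an assumption-laden simplification: it uses a least-squares loss and omits the paper's correlation-based weightings and the Elastic-Net-style F-norm term, and the function names are invented for this example.

```python
import numpy as np

def l21_norm(W):
    # l2,1-norm: sum of the l2-norms of the rows of W
    return np.sum(np.sqrt(np.sum(W ** 2, axis=1)))

def select_features(X, Y, gamma=1.0, n_iter=50):
    """Rank features by solving min_W ||XW - Y||_F^2 + gamma * ||W||_{2,1}.

    X: (n_samples, n_features) data matrix.
    Y: (n_samples, n_classes) one-hot label matrix.
    Returns feature indices sorted by decreasing row norm of W.
    """
    n, d = X.shape
    D = np.eye(d)  # reweighting matrix, initialised to identity
    for _ in range(n_iter):
        # Closed-form update: W = (X^T X + gamma * D)^{-1} X^T Y
        W = np.linalg.solve(X.T @ X + gamma * D, X.T @ Y)
        # Recompute D from current row norms (small eps avoids division by zero)
        row_norms = np.sqrt(np.sum(W ** 2, axis=1)) + 1e-12
        D = np.diag(1.0 / (2.0 * row_norms))
    # Features with large row norms in W are jointly relevant across classes
    return np.argsort(-row_norms)
```

The row-wise coupling is what makes the sparsity "joint": a feature is kept or discarded for all classes at once, which is the property the weighted scheme in the paper further sharpens.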
Pages: 1140-1149
Number of pages: 10