Double sparse-representation feature selection algorithm for classification

Cited: 0
Authors
Yonghua Zhu
Xuejun Zhang
Guoqiu Wen
Wei He
Debo Cheng
Affiliations
[1] Guangxi University, School of Computer, Electronics and Information
[2] Guangxi University, Guangxi Key Laboratory of Multimedia Communications and Network Technology (Cultivating Base)
[3] Guangxi Normal University, College of Computer Science and Information Technology
[4] Guangxi Normal University, Guangxi Key Lab of Multi
Keywords
Feature selection; Joint sparse learning; Self-representation
DOI
Not available
Abstract
Since large amounts of unlabeled, high-dimensional data must be preprocessed, unsupervised learning plays an increasingly important role in machine learning. This paper proposes a novel unsupervised feature selection algorithm that selects informative features from unlabeled data by combining two sparse-representation terms with a self-representation loss function in a unified framework. Specifically, the self-representation loss function represents each feature by the remaining features while minimizing the reconstruction error, and an l2,1-norm regularization term and an l1-norm regularization term are imposed simultaneously to make the coefficient matrix sparse, so that redundant and irrelevant features are filtered out to conduct feature selection; the l2,1-norm regularization enforces group sparsity, while the l1-norm regularization enforces element-wise sparsity. By exploiting both sparse-representation terms, representative features can be chosen more accurately. Finally, the reduced data are fed into a support vector machine (SVM) to measure classification accuracy, which is the main criterion used to validate the performance of the algorithm. Extensive experiments on synthetic and real-world datasets show that the proposed method outperforms widely used methods such as PCA and LPP.
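The objective described in the abstract can be sketched as minimizing ||X - XW||_F^2 + lam_21*||W||_{2,1} + lam_1*||W||_1 and then ranking features by the row norms of W. Below is a minimal sketch, assuming a sparse-group-lasso-style formulation solved by proximal gradient descent; the solver choice, function name, and parameter values are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def double_sparse_feature_selection(X, lam_21=0.1, lam_1=0.01, n_iter=200):
    """Illustrative sketch: self-representation feature selection with
    joint l2,1- and l1-norm sparsity, solved by proximal gradient
    descent (parameter names and values are assumptions)."""
    n, d = X.shape
    G = X.T @ X
    # Step size from the Lipschitz constant of the smooth term's gradient.
    step = 1.0 / (2.0 * np.linalg.norm(G, 2) + 1e-12)
    W = np.zeros((d, d))
    for _ in range(n_iter):
        grad = 2.0 * (G @ W - G)          # gradient of ||X - XW||_F^2
        W = W - step * grad
        # Element-wise soft-threshold (l1 prox) ...
        W = np.sign(W) * np.maximum(np.abs(W) - step * lam_1, 0.0)
        # ... then row-wise group soft-threshold (l2,1 prox).
        row_norms = np.linalg.norm(W, axis=1, keepdims=True)
        W = np.maximum(1.0 - step * lam_21 / (row_norms + 1e-12), 0.0) * W
    scores = np.linalg.norm(W, axis=1)    # row norm = feature importance
    return np.argsort(scores)[::-1], W

# Toy usage: two informative columns, two near-duplicates, two noise columns.
rng = np.random.default_rng(0)
base = rng.normal(size=(100, 2))
X = np.hstack([base, base + 0.01 * rng.normal(size=(100, 2)),
               rng.normal(size=(100, 2))])
ranking, W = double_sparse_feature_selection(X)
```

The top-ranked feature indices would then index the columns kept for the downstream SVM classifier.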
Pages: 17525-17539
Page count: 14
Related papers
50 records in total
  • [1] Double sparse-representation feature selection algorithm for classification
    Zhu, Yonghua
    Zhang, Xuejun
    Wen, Guoqiu
    He, Wei
    Cheng, Debo
    [J]. MULTIMEDIA TOOLS AND APPLICATIONS, 2017, 76 (16) : 17525 - 17539
  • [2] Unsupervised Feature Selection Algorithm Based on Sparse Representation
    Cui, Guoqing
    Yang, Jie
    Zareapoor, Masoumeh
    Wang, Jiechen
    [J]. 2016 3RD INTERNATIONAL CONFERENCE ON SYSTEMS AND INFORMATICS (ICSAI), 2016, : 1028 - 1033
  • [3] Feature Selection Tracking Algorithm Based on Sparse Representation
    Lou, Hui-dong
    Li, Wei-guang
    Hou, Yue-en
    Yao, Qing-he
    Ye, Guo-qiang
    Wan, Hao
    [J]. MATHEMATICAL PROBLEMS IN ENGINEERING, 2015, 2015
  • [4] A learning generalization bound with an application to sparse-representation classifiers
    Gat, Y
    [J]. MACHINE LEARNING, 2001, 42 (03) : 233 - 239
  • [5] Feature selection via kernel sparse representation
    Lv, Zhizheng
    Li, Yangding
    Li, Jieye
    [J]. 2019 IEEE SYMPOSIUM SERIES ON COMPUTATIONAL INTELLIGENCE (IEEE SSCI 2019), 2019, : 2637 - 2644
  • [6] Sparse Representation Preserving for Unsupervised Feature Selection
    Yan, Hui
    Jin, Zhong
    Yang, Jian
    [J]. 2014 22ND INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR), 2014, : 1574 - 1578
  • [7] Adaptive feature transformation for classification with sparse representation
    Sun, Yaxin
    Wen, Guihua
    [J]. OPTIK, 2015, 126 (23): : 4452 - 4459
  • [8] Feature selection based on sparse representation with the measures of classification error rate and complexity of boundary
    Deng, Yanli
    Jin, Weidong
    [J]. OPTIK, 2015, 126 (20): : 2634 - 2639
  • [9] A Learning Generalization Bound with an Application to Sparse-Representation Classifiers
    Yoram Gat
    [J]. Machine Learning, 2001, 42 : 233 - 239
  • [10] Feature selection by combining subspace learning with sparse representation
    Cheng, Debo
    Zhang, Shichao
    Liu, Xingyi
    Sun, Ke
    Zong, Ming
    [J]. MULTIMEDIA SYSTEMS, 2017, 23 (03) : 285 - 291