Deep Feature Selection using an Enhanced Sparse Group Lasso Algorithm

Cited by: 0
Authors:
Farokhmanesh, Fatemeh [1 ]
Sadeghi, Mohammad Taghi [1 ]
Affiliations:
[1] Yazd Univ, Dept Elect Engn, Yazd, Iran
Keywords:
feature selection; lasso; sparse representation; deep learning; regression
DOI
10.1109/iraniancee.2019.8786386
Chinese Library Classification (CLC):
TM [Electrical Engineering]; TN [Electronics and Communication Technology]
Discipline codes:
0808; 0809
Abstract
Feature selection is an important dimensionality reduction technique widely used in machine learning. Within this framework, sparse representation based feature selection methods are particularly attractive because they aim to represent the data with as few non-zero coefficients as possible. In deep neural networks, the feature space is usually very high dimensional, so feature selection approaches can be especially beneficial. In this paper, three sparse feature selection methods are first compared. One of the adopted approaches is the Sparse Group Lasso (SGL) algorithm, which is theoretically well founded and produces good results on hand-crafted features; its most important property is that it strongly induces sparsity. A key step of the SGL method is grouping the features. In this paper, a k-means clustering based method is applied to group the features. Our experimental results show that this sparse representation based method yields very good results on deep neural network features.
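As a rough illustration (not the paper's implementation), the pipeline the abstract describes — cluster the features with k-means, then fit a Sparse Group Lasso over those groups — might be sketched with a plain proximal-gradient solver. All function names, parameter values, and the synthetic data below are illustrative assumptions:

```python
import numpy as np
from sklearn.cluster import KMeans

def soft_threshold(x, t):
    """Elementwise soft-thresholding: proximal operator of the l1 norm."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def sparse_group_lasso(X, y, groups, lam=0.05, alpha=0.5, n_iter=500):
    """Proximal gradient (ISTA) for least squares with the SGL penalty
    (1/2n)||y - Xb||^2 + alpha*lam*||b||_1
                       + (1-alpha)*lam * sum_g sqrt(p_g)*||b_g||_2.
    The SGL prox is the l1 prox followed by the group-l2 prox.
    """
    n, p = X.shape
    beta = np.zeros(p)
    # step size = 1 / Lipschitz constant of the least-squares gradient
    step = n / (np.linalg.norm(X, 2) ** 2)
    for _ in range(n_iter):
        grad = X.T @ (X @ beta - y) / n
        b = soft_threshold(beta - step * grad, step * alpha * lam)  # l1 prox
        for g in np.unique(groups):                                 # group prox
            idx = groups == g
            thr = step * (1.0 - alpha) * lam * np.sqrt(idx.sum())
            nrm = np.linalg.norm(b[idx])
            b[idx] = 0.0 if nrm <= thr else (1.0 - thr / nrm) * b[idx]
        beta = b
    return beta

def kmeans_feature_groups(X, k, seed=0):
    """Cluster the *columns* of X so correlated features share a group."""
    return KMeans(n_clusters=k, n_init=10, random_state=seed).fit(X.T).labels_

# Synthetic demo: 12 features in 4 groups of 3; only the first group matters.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 12))
beta_true = np.zeros(12)
beta_true[:3] = [2.0, -1.5, 1.0]
y = X @ beta_true + 0.01 * rng.standard_normal(200)
groups = np.repeat(np.arange(4), 3)
beta = sparse_group_lasso(X, y, groups)
```

After fitting, the coefficients of the three informative features stay close to their true values, while the remaining groups are driven to (near) zero — the group penalty removes whole clusters of features at once, which is the sparsity-inducing behavior the abstract attributes to SGL.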
Pages: 1549 - 1552 (4 pages)
Related papers (50 total):
  • [1] Sparse group LASSO based uncertain feature selection
    Xie, Zongxia
    Xu, Yong
    International Journal of Machine Learning and Cybernetics, 2014, 5 (02) : 201 - 210
  • [2] Heterogeneous Feature Selection With Multi-Modal Deep Neural Networks and Sparse Group LASSO
    Zhao, Lei
    Hu, Qinghua
    Wang, Wenwu
    IEEE Transactions on Multimedia, 2015, 17 (11) : 1936 - 1948
  • [3] Feature Selection for Neural Networks Using Group Lasso Regularization
    Zhang, Huaqing
    Wang, Jian
    Sun, Zhanquan
    Zurada, Jacek M.
    Pal, Nikhil R.
    IEEE Transactions on Knowledge and Data Engineering, 2020, 32 (04) : 659 - 673
  • [4] Simultaneous Channel and Feature Selection of Fused EEG Features Based on Sparse Group Lasso
    Wang, Jin-Jia
    Xue, Fang
    Li, Hui
    BioMed Research International, 2015, 2015
  • [5] Feature Selection for Fuzzy Neural Networks using Group Lasso Regularization
    Gao, Tao
    Bai, Xiao
    Zhang, Liang
    Wang, Jian
    2021 IEEE Symposium Series on Computational Intelligence (IEEE SSCI 2021), 2021
  • [6] Discriminative Feature Selection for Multiple Ocular Diseases Classification by Sparse Induced Graph Regularized Group Lasso
    Chen, Xiangyu
    Xu, Yanwu
    Yan, Shuicheng
    Chua, Tat-Seng
    Wong, Damon Wing Kee
    Wong, Tien Yin
    Liu, Jiang
    Medical Image Computing and Computer-Assisted Intervention - MICCAI 2015, Pt II, 2015, 9350 : 11 - 19
  • [7] Feature Selection Using a Neural Network With Group Lasso Regularization and Controlled Redundancy
    Wang, Jian
    Zhang, Huaqing
    Wang, Junze
    Pu, Yifei
    Pal, Nikhil R.
    IEEE Transactions on Neural Networks and Learning Systems, 2021, 32 (03) : 1110 - 1123
  • [8] Speech bottleneck feature extraction method based on overlapping group lasso sparse deep neural network
    Luo, Yuan
    Liu, Yu
    Zhang, Yi
    Yue, Congcong
    Speech Communication, 2018, 99 : 56 - 61
  • [9] Sparse group variable selection based on quantile hierarchical Lasso
    Zhao, Weihua
    Zhang, Riquan
    Liu, Jicai
    Journal of Applied Statistics, 2014, 41 (08) : 1658 - 1677