Nonlinear Feature Selection Neural Network via Structured Sparse Regularization

Cited by: 2
|
Authors
Wang, Rong [1 ,2 ]
Bian, Jintang [1 ,2 ,3 ]
Nie, Feiping [1 ,2 ,3 ]
Li, Xuelong [1 ,2 ]
Affiliations
[1] Northwestern Polytech Univ, Sch Artificial Intelligence Opt & Elect iOPEN, Xian 710072, Peoples R China
[2] Northwestern Polytech Univ, Key Lab Intelligent Interact & Applicat, Minist Ind & Informat Technol, Xian 710072, Peoples R China
[3] Northwestern Polytech Univ, Sch Comp Sci, Xian 710072, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Classification; neural network; nonlinear feature selection; structured sparsity regularization; supervised learning; REPRESENTATION; REGRESSION;
DOI
10.1109/TNNLS.2022.3209716
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Feature selection is an important and effective data preprocessing method that removes noisy and redundant features while retaining the relevant and discriminative features of high-dimensional data. In real-world applications, the relationship between data samples and their labels is usually nonlinear. However, most existing feature selection models learn a linear transformation matrix, which cannot capture such nonlinear structure in practice and degrades the performance of downstream tasks. To address this issue, we propose a novel nonlinear feature selection method that selects the most relevant and discriminative features in high-dimensional datasets. Specifically, our method learns the nonlinear structure of high-dimensional data with a neural network trained under a cross-entropy loss, and applies a structured sparsity norm, such as the ℓ2,p-norm, to regularize the weight matrix connecting the input layer and the first hidden layer, thereby learning a weight for each feature. A structurally sparse weight matrix is thus obtained by nonlinear learning with a neural network under structured sparsity regularization. We then use gradient descent to obtain the optimal solution of the proposed model. Experimental results on several synthetic and real-world datasets show the effectiveness and superiority of the proposed nonlinear feature selection model.
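The abstract's recipe can be sketched in a few lines of NumPy. The following is a minimal illustration, not the authors' implementation: a one-hidden-layer classifier trained by full-batch gradient descent on cross-entropy plus an ℓ2,1 penalty (the p = 1 case of the ℓ2,p-norm) on the rows of the first-layer weight matrix, where each row corresponds to one input feature. Feature importance is then read off as the row norms. The synthetic data, network size, and hyperparameters are all illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: 10 features, but only features 0 and 1 determine the label.
n, d, h, c = 400, 10, 16, 2
X = rng.standard_normal((n, d))
y = (X[:, 0] + X[:, 1] > 0).astype(int)
Y = np.eye(c)[y]                      # one-hot labels

# One-hidden-layer network; row j of W1 carries feature j into the network.
W1 = rng.standard_normal((d, h)) * 0.1
b1 = np.zeros(h)
W2 = rng.standard_normal((h, c)) * 0.1
b2 = np.zeros(c)

lam, lr, eps = 1e-2, 0.5, 1e-8
for _ in range(500):
    # Forward: ReLU hidden layer, softmax output.
    Z1 = X @ W1 + b1
    A1 = np.maximum(Z1, 0.0)
    Z2 = A1 @ W2 + b2
    P = np.exp(Z2 - Z2.max(axis=1, keepdims=True))
    P /= P.sum(axis=1, keepdims=True)

    # Backward: gradients of the mean cross-entropy loss.
    dZ2 = (P - Y) / n
    dW2 = A1.T @ dZ2
    db2 = dZ2.sum(axis=0)
    dZ1 = (dZ2 @ W2.T) * (Z1 > 0)
    # l2,1 subgradient on W1: shrinks whole feature rows toward zero.
    row_norm = np.linalg.norm(W1, axis=1, keepdims=True)
    dW1 = X.T @ dZ1 + lam * W1 / (row_norm + eps)
    db1 = dZ1.sum(axis=0)

    for p, g in ((W1, dW1), (b1, db1), (W2, dW2), (b2, db2)):
        p -= lr * g

# Feature scores: row norms of the regularized first-layer weight matrix.
scores = np.linalg.norm(W1, axis=1)
print(np.argsort(scores)[::-1][:2])   # indices of the two top-scoring features
```

With the group penalty, rows of W1 belonging to uninformative features are driven toward the zero vector as a unit, so the surviving row norms rank features. The paper's ℓ2,p case for general p < 1 would replace the subgradient term accordingly.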
Pages: 9493 - 9505 (13 pages)
Related Papers (50 total)
  • [41] STRUCTURED SPARSE MODEL BASED FEATURE SELECTION AND CLASSIFICATION FOR HYPERSPECTRAL IMAGERY
    Qian, Yuntao
    Zhou, Jun
    Ye, Minchao
    Wang, Qi
    2011 IEEE INTERNATIONAL GEOSCIENCE AND REMOTE SENSING SYMPOSIUM (IGARSS), 2011, : 1771 - 1774
  • [42] Neural network input feature selection using structured ℓ2-norm penalization
    Egwu, Nathaniel
    Mrziglod, Thomas
    Schuppert, Andreas
    APPLIED INTELLIGENCE, 2023, 53 (05) : 5732 - 5749
  • [43] Brazilian Exchange Rates Forecast via Feature Selection and Artificial Neural Network
    Carneiro de Freitas, A. A.
    Sousa Junior, E. F.
    Oliveira Veras, G. V.
    Santos, W. R. N.
    2021 14TH IEEE INTERNATIONAL CONFERENCE ON INDUSTRY APPLICATIONS (INDUSCON), 2021, : 805 - 812
  • [45] Correlation Tracking via Spatial-Temporal Constraints and Structured Sparse Regularization
    Tian, Dan
    Zang, Shouyu
    Tu, Binbin
    IEEE ACCESS, 2021, 9 : 82675 - 82685
  • [46] Semisupervised Feature Selection via Structured Manifold Learning
    Chen, Xiaojun
    Chen, Renjie
    Wu, Qingyao
    Nie, Feiping
    Yang, Min
    Mao, Rui
    IEEE TRANSACTIONS ON CYBERNETICS, 2022, 52 (07) : 5756 - 5766
  • [47] Sparse Interaction Additive Networks via Feature Interaction Detection and Sparse Selection
    Enouen, James
    Liu, Yan
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022,
  • [48] IMAGE MODELING AND ENHANCEMENT VIA STRUCTURED SPARSE MODEL SELECTION
    Yu, Guoshen
    Sapiro, Guillermo
    Mallat, Stephane
    2010 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING, 2010, : 1641 - 1644
  • [49] Neural network for a class of sparse optimization with L0-regularization
    Wei, Zhe
    Li, Qingfa
    Wei, Jiazhen
    Bian, Wei
    NEURAL NETWORKS, 2022, 151 : 211 - 221
  • [50] Convergence analysis of BP neural networks via sparse response regularization
    Wang, Jian
    Wen, Yanqing
    Ye, Zhenyun
    Jian, Ling
    Chen, Hua
    APPLIED SOFT COMPUTING, 2017, 61 : 354 - 363