Nonlinear Feature Selection Neural Network via Structured Sparse Regularization

Cited by: 2
Authors
Wang, Rong [1 ,2 ]
Bian, Jintang [1 ,2 ,3 ]
Nie, Feiping [1 ,2 ,3 ]
Li, Xuelong [1 ,2 ]
Affiliations
[1] Northwestern Polytech Univ, Sch Artificial Intelligence Opt & Elect iOPEN, Xian 710072, Peoples R China
[2] Northwestern Polytech Univ, Key Lab Intelligent Interact & Applicat, Minist Ind & Informat Technol, Xian 710072, Peoples R China
[3] Northwestern Polytech Univ, Sch Comp Sci, Xian 710072, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Classification; neural network; nonlinear feature selection; structured sparsity regularization; supervised learning; REPRESENTATION; REGRESSION;
DOI
10.1109/TNNLS.2022.3209716
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104; 0812; 0835; 1405;
Abstract
Feature selection is an important and effective data preprocessing method: it removes noisy and redundant features while retaining the relevant and discriminative ones in high-dimensional data. In real-world applications, the relationship between data samples and their labels is usually nonlinear. However, most existing feature selection models focus on learning a linear transformation matrix, which cannot capture such nonlinear structure and therefore degrades the performance of downstream tasks. To address this issue, we propose a novel nonlinear feature selection method that selects the most relevant and discriminative features in high-dimensional datasets. Specifically, our method learns the nonlinear structure of high-dimensional data with a neural network trained under the cross-entropy loss, and regularizes the weight matrix connecting the input layer and the first hidden layer with a structured sparsity norm, the ℓ2,p-norm, so that a weight is learned for each feature. A structurally sparse weight matrix is thus obtained by performing nonlinear learning with structured sparsity regularization. The proposed model is optimized with gradient descent. Experimental results on several synthetic and real-world datasets demonstrate the effectiveness and superiority of the proposed nonlinear feature selection model.
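As a rough illustration of the pipeline the abstract describes (a classifier trained with cross-entropy loss whose input-to-hidden weight matrix carries an ℓ2,p-norm structured sparsity penalty, with features ranked by the resulting group norms), the following is a minimal PyTorch sketch. The hidden width, value of p, learning rate, regularization strength, and synthetic data are illustrative assumptions, not the authors' configuration.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

d, n_classes, hidden = 100, 3, 64            # hypothetical dimensions
X = torch.randn(512, d)                      # synthetic samples
y = torch.randint(0, n_classes, (512,))      # synthetic integer labels

# Classifier: the first Linear layer's weight matrix ties each input feature
# to the hidden units; structured sparsity on it drives whole features to zero.
model = nn.Sequential(
    nn.Linear(d, hidden),
    nn.ReLU(),
    nn.Linear(hidden, n_classes),
)
ce = nn.CrossEntropyLoss()
opt = torch.optim.SGD(model.parameters(), lr=0.05)

lam, p, eps = 1e-2, 0.5, 1e-8                # penalty weight, norm order, stabilizer (assumed values)

for epoch in range(200):
    opt.zero_grad()
    logits = model(X)
    # l2,p penalty on the input-to-hidden weights: in PyTorch the weight has
    # shape (hidden, d), so each column collects the weights of one input
    # feature; sum the p-th powers of the column-wise l2 norms.
    W1 = model[0].weight
    feat_norms = torch.sqrt(W1.pow(2).sum(dim=0) + eps)
    loss = ce(logits, y) + lam * feat_norms.pow(p).sum()
    loss.backward()
    opt.step()

# Rank features by the learned column norms; near-zero norms mark features
# the regularized network considers irrelevant.
scores = model[0].weight.detach().norm(dim=0)
top_k = torch.topk(scores, k=10).indices
print("selected feature indices:", top_k.tolist())
```

Full-batch gradient descent is used here to mirror the abstract's optimization description; any stochastic variant would work the same way on the penalized objective.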
Pages: 9493 - 9505
Number of pages: 13
Related Papers
50 records in total
  • [31] Feature flow regularization: Improving structured sparsity in deep neural networks
    Wu, Yue
    Lan, Yuan
    Zhang, Luchan
    Xiang, Yang
    NEURAL NETWORKS, 2023, 161 : 598 - 613
  • [32] A quantitative benchmark of neural network feature selection methods for detecting nonlinear signals
    Passemiers, Antoine
    Folco, Pietro
    Raimondi, Daniele
    Birolo, Giovanni
    Moreau, Yves
    Fariselli, Piero
    SCIENTIFIC REPORTS, 2024, 14 (01):
  • [33] Feature extraction and selection of neural network
    Wu, CD
    Gao, F
    Ma, SH
    PROCEEDINGS OF THE 3RD WORLD CONGRESS ON INTELLIGENT CONTROL AND AUTOMATION, VOLS 1-5, 2000 : 1103 - 1106
  • [34] Feature Selection for Neural Networks Using Group Lasso Regularization
    Zhang, Huaqing
    Wang, Jian
    Sun, Zhanquan
    Zurada, Jacek M.
    Pal, Nikhil R.
    IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2020, 32 (04) : 659 - 673
  • [35] Deep Neural Networks Pruning via the Structured Perspective Regularization
    Cacciola, Matteo
    Frangioni, Antonio
    Li, Xinlin
    Lodi, Andrea
    SIAM JOURNAL ON MATHEMATICS OF DATA SCIENCE, 2023, 5 (04) : 1051 - 1077
  • [36] Sparse neural network regression with variable selection
    Shin, Jae-Kyung
    Bak, Kwan-Young
    Koo, Ja-Yong
    COMPUTATIONAL INTELLIGENCE, 2022, 38 (06) : 2075 - 2094
  • [37] Sparse Neural Additive Model: Interpretable Deep Learning with Feature Selection via Group Sparsity
    Xu, Shiyun
    Bu, Zhiqi
    Chaudhari, Pratik
    Barnett, Ian J.
    MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES: RESEARCH TRACK, ECML PKDD 2023, PT III, 2023, 14171 : 343 - 359
  • [38] Robust tracking via discriminative sparse feature selection
    Zhan, Jin
    Su, Zhuo
    Wu, Hefeng
    Luo, Xiaonan
    VISUAL COMPUTER, 2015, 31 (05) : 575 - 588
  • [39] Tagging Chinese Microblogger via Sparse Feature Selection
    Shang, Di
    Dai, Xin-Yu
    Huang, Shujian
    Li, Yi
    Chen, Jiajun
    2016 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2016, : 2460 - 2467