Nonlinear Feature Selection Neural Network via Structured Sparse Regularization

Cited: 2
Authors
Wang, Rong [1 ,2 ]
Bian, Jintang [1 ,2 ,3 ]
Nie, Feiping [1 ,2 ,3 ]
Li, Xuelong [1 ,2 ]
Affiliations
[1] Northwestern Polytech Univ, Sch Artificial Intelligence Opt & Elect iOPEN, Xian 710072, Peoples R China
[2] Northwestern Polytech Univ, Key Lab Intelligent Interact & Applicat, Minist Ind & Informat Technol, Xian 710072, Peoples R China
[3] Northwestern Polytech Univ, Sch Comp Sci, Xian 710072, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Classification; neural network; nonlinear feature selection; structured sparsity regularization; supervised learning; REPRESENTATION; REGRESSION
DOI
10.1109/TNNLS.2022.3209716
CLC number
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Feature selection is an important and effective data preprocessing method that removes noisy and redundant features while retaining the relevant and discriminative features of high-dimensional data. In real-world applications, the relationships between data samples and their labels are usually nonlinear. However, most existing feature selection models learn a linear transformation matrix, which cannot capture such nonlinear structure in practice and degrades the performance of downstream tasks. To address this issue, we propose a novel nonlinear feature selection method that selects the most relevant and discriminative features in high-dimensional datasets. Specifically, our method learns the nonlinear structure of high-dimensional data with a neural network trained under a cross-entropy loss, and regularizes the weight matrix connecting the input layer and the first hidden layer with a structured sparsity norm, such as the ℓ2,p-norm, to learn the weight of each feature. A structured sparse weight matrix is therefore obtained by nonlinear learning with a neural network under structured sparsity regularization. We then use the gradient descent method to obtain the optimal solution of the proposed model. Experimental results on several synthetic and real-world datasets demonstrate the effectiveness and superiority of the proposed nonlinear feature selection model.
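The training loop the abstract describes (a neural network under cross-entropy loss, with a structured sparsity penalty on the rows of the input-layer weight matrix, optimized by gradient descent) can be sketched as follows for the common p = 1 case, i.e., an ℓ2,1 penalty handled by its subgradient. All variable names, hyperparameters, and the toy data here are illustrative assumptions, not details from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data: 200 samples, 10 features; only features 0 and 1
# determine the label, so a good selector should rank them highest.
n, d, h, c = 200, 10, 16, 2
X = rng.standard_normal((n, d))
y = (X[:, 0] + X[:, 1] > 0).astype(int)
Y = np.eye(c)[y]                      # one-hot labels

# One-hidden-layer network: X -> ReLU(X W1 + b1) -> softmax(H W2 + b2)
W1 = rng.standard_normal((d, h)) * 0.1
b1 = np.zeros(h)
W2 = rng.standard_normal((h, c)) * 0.1
b2 = np.zeros(c)

lam, lr = 0.005, 0.5                  # penalty strength and step size (illustrative)
for _ in range(1000):
    # Forward pass with a numerically stable softmax.
    H = np.maximum(X @ W1 + b1, 0)
    Z = H @ W2 + b2
    P = np.exp(Z - Z.max(axis=1, keepdims=True))
    P /= P.sum(axis=1, keepdims=True)

    # Backward pass: gradient of the averaged softmax cross-entropy.
    dZ = (P - Y) / n
    dW2 = H.T @ dZ
    db2 = dZ.sum(axis=0)
    dH = dZ @ W2.T
    dH[H <= 0] = 0                    # ReLU gate
    dW1 = X.T @ dH
    db1 = dH.sum(axis=0)

    # Subgradient of the l2,1 penalty on rows of W1 (zero rows guarded);
    # this shrinks entire rows, i.e., entire input features, toward zero.
    row_norms = np.linalg.norm(W1, axis=1, keepdims=True)
    dW1 += lam * W1 / np.maximum(row_norms, 1e-12)

    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

# Feature importance = row norms of the structurally sparse input-layer weights.
scores = np.linalg.norm(W1, axis=1)
print(np.argsort(scores)[::-1])       # features ranked by learned weight
```

A proximal (group soft-thresholding) step would produce exactly zero rows; the plain subgradient used here only drives uninformative rows near zero, which is enough to rank features.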
Pages: 9493 - 9505
Page count: 13
Related Papers
50 records
  • [1] Feature Selection of Network Data VIA ℓ2,p Regularization
    Zhou, Ruizhi
    Niu, Lingfeng
    COGNITIVE COMPUTATION, 2020, 12 : 1217 - 1232
  • [2] Structured Sparsity of Convolutional Neural Networks via Nonconvex Sparse Group Regularization
    Bui, Kevin
    Park, Fredrick
    Zhang, Shuai
    Qi, Yingyong
    Xin, Jack
    FRONTIERS IN APPLIED MATHEMATICS AND STATISTICS, 2021, 6
  • [3] Discriminative Feature Selection via A Structured Sparse Subspace Learning Module
    Wang, Zheng
    Nie, Feiping
    Tian, Lai
    Wang, Rong
    Li, Xuelong
    PROCEEDINGS OF THE TWENTY-NINTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2020, : 3009 - 3015
  • [4] Multi-class feature selection via Sparse Softmax with a discriminative regularization
    Sun, Zhenzhen
    Chen, Zexiang
    Liu, Jinghua
    Yu, Yuanlong
    INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 2025, 16 (01) : 159 - 172
  • [5] Multi-label feature selection via robust flexible sparse regularization
    Li, Yonghao
    Hu, Liang
    Gao, Wanfu
    PATTERN RECOGNITION, 2023, 134
  • [6] Semi-supervised feature selection analysis with structured multi-view sparse regularization
    Shi, Caijuan
    Duan, Changyu
    Gu, Zhibin
    Tian, Qi
    An, Gaoyun
    Zhao, Ruizhen
    NEUROCOMPUTING, 2019, 330 : 412 - 424
  • [7] Deep Neural Network Regularization for Feature Selection in Learning-to-Rank
    Rahangdale, Ashwini
    Raut, Shital
    IEEE ACCESS, 2019, 7 : 53988 - 54006
  • [8] BP Neural Network Feature Selection Based on Group Lasso Regularization
    Liu, Tiqian
    Xiao, Jiang-Wen
    Huang, Zhengyi
    Kong, Erdan
    Liang, Yuntao
    2019 CHINESE AUTOMATION CONGRESS (CAC2019), 2019, : 2786 - 2790
  • [9] Non-linear Feature Selection Based on Convolution Neural Networks with Sparse Regularization
    Wu, Wen-Bin
    Chen, Si-Bao
    Ding, Chris
    Luo, Bin
    COGNITIVE COMPUTATION, 2024, 16 (02) : 654 - 670