A Deep-Layer Feature Selection Method Based on Deep Neural Networks

Cited by: 1
Authors
Qiao, Chen [1 ]
Sun, Ke-Feng [1 ]
Li, Bin [1 ]
Affiliations
[1] Xi An Jiao Tong Univ, Sch Math & Stat, Xian 710049, Peoples R China
Keywords
Features back-selection; Deep neural networks; Deep-layer architecture; Key sites;
DOI
10.1007/978-3-319-93818-9_52
CLC number
TP18 [Artificial Intelligence Theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Inspired by the sparse mechanism of the biological nervous system, we propose a novel feature selection algorithm, the features back-selection (FBS) method, which is built on a deep learning architecture. Unlike existing feature selection methods, it is not a shallow-layer approach: working from a global perspective, it traces back, step by step, from the abstract features learned at the top of a deep neural network to the key feature sites of the raw data. On the MNIST data, the FBS method performs well at locating the important original pixels of the digit images. This shows that the FBS method not only identifies the features relevant to the learning task while maintaining high prediction accuracy, but also reduces data storage requirements and computational complexity.
Pages: 542-551
Number of pages: 10
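The abstract describes FBS as tracing back, layer by layer, from the abstract features at the top of a trained deep network to the key sites (pixels) of the raw input, but it gives no algorithmic details. A minimal sketch of that general back-tracing idea, assuming a plain feed-forward network and a simple rule that propagates importance scores backward through the absolute weight magnitudes, might look as follows; the function names, the relevance rule, and the layer sizes are illustrative assumptions, not the authors' exact method.

# Hypothetical sketch of the "back-selection" idea: propagate importance
# scores from the top-layer abstract features back to the input pixels
# through the absolute weights of a trained feed-forward network.
# Illustration only, not the authors' exact FBS algorithm.

import numpy as np

def back_select(weights, top_scores, keep_ratio=0.05):
    """weights: list of weight matrices [W1, ..., WL], each of shape (n_in, n_out).
    top_scores: importance scores for the top-layer abstract features.
    Returns indices of the highest-scoring input features (e.g. MNIST pixels)."""
    scores = np.asarray(top_scores, dtype=float)
    for W in reversed(weights):
        # Distribute each unit's score over the units feeding into it,
        # weighted by the absolute connection strengths.
        scores = np.abs(W) @ scores
    k = max(1, int(keep_ratio * scores.size))
    return np.argsort(scores)[::-1][:k]  # indices of the top-k input features

# Toy usage with MNIST-sized inputs (28 x 28 = 784 pixels) and two hidden layers.
rng = np.random.default_rng(0)
Ws = [rng.standard_normal((784, 256)), rng.standard_normal((256, 64))]
key_pixels = back_select(Ws, top_scores=np.ones(64))
print(key_pixels[:10])

On real MNIST data the weight matrices would come from a trained network rather than random initialization, and the retained pixel indices could then be used to discard the remaining inputs, which is how the abstract's claimed savings in storage and computation would be realized.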
Related Papers
50 records in total
  • [1] CancelOut: A Layer for Feature Selection in Deep Neural Networks
    Borisov, Vadim
    Haug, Johannes
    Kasneci, Gjergji
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2019: DEEP LEARNING, PT II, 2019, 11728 : 72 - 83
  • [2] Feature Selection using Deep Neural Networks
    Roy, Debaditya
    Murty, K. Sri Rama
    Mohan, C. Krishna
    2015 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2015,
  • [3] DeepPINK: reproducible feature selection in deep neural networks
    Lu, Yang Young
    Fan, Yingying
    Lv, Jinchi
    Noble, William Stafford
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 31 (NIPS 2018), 2018, 31
  • [4] Mixture of Deep Neural Networks for Instancewise Feature Selection
    Xiao, Qi
    Wang, Zhengdao
    2019 57TH ANNUAL ALLERTON CONFERENCE ON COMMUNICATION, CONTROL, AND COMPUTING (ALLERTON), 2019, : 917 - 921
  • [5] Nonparametric feature selection by random forests and deep neural networks
    Mao, Xiaojun
    Peng, Liuhua
    Wang, Zhonglei
    COMPUTATIONAL STATISTICS & DATA ANALYSIS, 2022, 170
  • [6] Distribution-dependent feature selection for deep neural networks
    Zhao, Xuebin
    Li, Weifu
    Chen, Hong
    Wang, Yingjie
    Chen, Yanhong
    John, Vijay
    APPLIED INTELLIGENCE, 2022, 52 (04) : 4432 - 4442
  • [7] Feature Selection for Deep Neural Networks in Cyber Security Applications
    Davis, Alexander
    Gill, Sumanjit
    Wong, Robert
    Tayeb, Shahab
    2020 IEEE INTERNATIONAL IOT, ELECTRONICS AND MECHATRONICS CONFERENCE (IEMTRONICS 2020), 2020, : 82 - 88
  • [8] Deep-gKnock: Nonlinear group-feature selection with deep neural networks
    Zhu, Guangyu
    Zhao, Tingting
    NEURAL NETWORKS, 2021, 135 : 139 - 147
  • [9] DOA Estimation by Feature Extraction Based on Parallel Deep Neural Networks and MRMR Feature Selection Algorithm
    Al-Tameemi, Ashwaq Neaman Hassan
    Feghhi, Mahmood Mohassel
    Tazehkand, Behzad Mozaffari
    IEEE ACCESS, 2025, 13 : 40480 - 40502