Distribution-dependent feature selection for deep neural networks

Cited: 1
Authors
Zhao, Xuebin [1 ]
Li, Weifu [1 ]
Chen, Hong [1 ]
Wang, Yingjie [2 ]
Chen, Yanhong [3 ]
John, Vijay [4 ]
Affiliations
[1] Huazhong Agr Univ, Coll Sci, Wuhan 430062, Peoples R China
[2] Huazhong Agr Univ, Coll Informat, Wuhan 430062, Peoples R China
[3] Chinese Acad Sci, Natl Space Sci Ctr, Beijing 100190, Peoples R China
[4] Toyota Technol Inst, Res Ctr Smart Vehicles, Tempaku Ku, 2-12-1 Hisakata, Nagoya, Aichi 4688511, Japan
Funding
National Natural Science Foundation of China;
Keywords
Feature selection; Coronal mass ejections; Deep neural networks; Interpretability; Hypothesis-testing; FALSE DISCOVERY RATE; REGRESSION; FILTER;
DOI
10.1007/s10489-021-02663-1
CLC Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104; 0812; 0835; 1405;
Abstract
While deep neural networks (DNNs) have achieved impressive performance on a wide variety of tasks, their black-box nature hinders their applicability to high-risk decision-making fields. In such fields, besides accurate prediction, it is also desirable to provide interpretable insights into DNNs, e.g., screening important features based on their contributions to predictive accuracy. To improve the interpretability of DNNs, this paper proposes a new feature selection algorithm for DNNs that integrates the knockoff technique with the distribution information of irrelevant features. With the help of knockoff features and the central limit theorem, we show that the statistic of an irrelevant feature follows a known Gaussian distribution under mild conditions. This information is applied in hypothesis testing to discover the key features associated with the DNNs. Empirical evaluations on simulated data demonstrate that the proposed method selects more truly informative features, with higher F1 scores. The Friedman test and the post-hoc Nemenyi test are employed to validate the superiority of the proposed method. We then apply our method to Coronal Mass Ejection (CME) data and uncover the key features that contribute to DNN-based CME arrival-time prediction.
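The knockoff-plus-hypothesis-testing idea in the abstract can be illustrated with a minimal sketch. This is not the paper's algorithm: it assumes independent standard-Gaussian features (so a fresh draw serves as a valid knockoff), and it substitutes a simple covariance-based importance score for a DNN-derived one. The statistic for each feature is its importance gap over its knockoff, which is approximately centred Gaussian under the null, so a one-sided z-test flags informative features.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 200 samples, 10 i.i.d. N(0,1) features; only the first 3 drive y.
n, p = 200, 10
X = rng.standard_normal((n, p))
y = X[:, 0] + 0.8 * X[:, 1] - 0.6 * X[:, 2] + 0.1 * rng.standard_normal(n)

# For independent N(0,1) features, fresh draws from the same distribution
# are valid knockoffs (a deliberate simplification of model-X knockoffs).
X_ko = rng.standard_normal((n, p))

def importance(Z, y):
    # Absolute empirical covariance with y, standing in for a
    # DNN-derived importance score.
    return np.abs((Z - Z.mean(0)).T @ (y - y.mean()) / len(y))

# Knockoff statistic: importance gap between each feature and its knockoff.
W = importance(X, y) - importance(X_ko, y)

# Under the null, W_j is approximately centred Gaussian (CLT); estimate
# its scale robustly and run a one-sided z-test per feature.
sigma = 1.4826 * np.median(np.abs(W))   # MAD-based null-scale estimate
z = W / sigma
selected = np.flatnonzero(z > 2.33)     # roughly a 1% one-sided level
print("selected features:", selected)
```

On this toy problem the three informative features produce large positive gaps, while null features hover near zero, so the z-test recovers the true support.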
Pages: 4432-4442
Page count: 11
Related Papers
50 records in total
  • [1] Distribution-dependent feature selection for deep neural networks
    Xuebin Zhao
    Weifu Li
    Hong Chen
    Yingjie Wang
    Yanhong Chen
    Vijay John
    Applied Intelligence, 2022, 52 : 4432 - 4442
  • [2] Feature Selection using Deep Neural Networks
    Roy, Debaditya
    Murty, K. Sri Rama
    Mohan, C. Krishna
    2015 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2015,
  • [3] CancelOut: A Layer for Feature Selection in Deep Neural Networks
    Borisov, Vadim
    Haug, Johannes
    Kasneci, Gjergji
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2019: DEEP LEARNING, PT II, 2019, 11728 : 72 - 83
  • [4] DeepPINK: reproducible feature selection in deep neural networks
    Lu, Yang Young
    Fan, Yingying
    Lv, Jinchi
    Noble, William Stafford
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 31 (NIPS 2018), 2018, 31
  • [5] Mixture of Deep Neural Networks for Instancewise Feature Selection
    Xiao, Qi
    Wang, Zhengdao
    2019 57TH ANNUAL ALLERTON CONFERENCE ON COMMUNICATION, CONTROL, AND COMPUTING (ALLERTON), 2019, : 917 - 921
  • [6] Nonparametric feature selection by random forests and deep neural networks
    Mao, Xiaojun
    Peng, Liuhua
    Wang, Zhonglei
    COMPUTATIONAL STATISTICS & DATA ANALYSIS, 2022, 170
  • [7] Feature Selection for Deep Neural Networks in Cyber Security Applications
    Davis, Alexander
    Gill, Sumanjit
    Wong, Robert
    Tayeb, Shahab
    2020 IEEE INTERNATIONAL IOT, ELECTRONICS AND MECHATRONICS CONFERENCE (IEMTRONICS 2020), 2020, : 82 - 88
  • [8] A Deep-Layer Feature Selection Method Based on Deep Neural Networks
    Qiao, Chen
    Sun, Ke-Feng
    Li, Bin
    ADVANCES IN SWARM INTELLIGENCE, ICSI 2018, PT II, 2018, 10942 : 542 - 551
  • [9] Feature selection with neural networks
    Verikas, A
    Bacauskiene, M
    PATTERN RECOGNITION LETTERS, 2002, 23 (11) : 1323 - 1335
  • [10] Feature Selection With Neural Networks
    Philippe Leray
    Patrick Gallinari
    Behaviormetrika, 1999, 26 (1) : 145 - 166