Neural Rule Ensembles: Encoding Sparse Feature Interactions into Neural Networks

Cited by: 1
Authors
Dawer, Gitesh [1 ]
Guo, Yangzi [2 ]
Liu, Sida [3 ]
Barbu, Adrian [3 ]
Affiliations
[1] Apple Inc, CoreML Grp, Cupertino, CA 95014 USA
[2] Florida State Univ, Dept Math, Tallahassee, FL 32306 USA
[3] Florida State Univ, Dept Stat, Tallahassee, FL 32306 USA
Keywords
APPROXIMATION;
DOI
10.1109/ijcnn48605.2020.9206956
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence];
Discipline Codes
081104; 0812; 0835; 1405;
Abstract
Artificial Neural Networks form the basis of very powerful learning methods. It has been observed that a naive application of fully connected neural networks to data with many irrelevant variables often leads to overfitting. In an attempt to circumvent this issue, prior knowledge about which features are relevant and how they may interact can be encoded into these networks. In this work, we use decision trees to capture such relevant features and their interactions, and we define a mapping that encodes the extracted relationships into a neural network. This addresses the initialization-related concerns of fully connected neural networks, and at the same time, through feature selection, it enables learning of more compact representations than state-of-the-art tree-based approaches. Empirical evaluations and simulation studies show the superiority of this approach over fully connected neural networks and tree-based approaches.
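The abstract outlines the general recipe: fit a decision tree, read off the feature interactions it learns as root-to-leaf rules, and use them to initialize a sparse neural network. Below is a minimal Python sketch of that idea using scikit-learn. It is an illustrative assumption of how such an encoding could look, not the authors' exact mapping; the helper extract_rules and the sign/threshold initialization are invented for this example.

import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

# Toy data with a few informative features among many irrelevant ones.
X, y = make_classification(n_samples=500, n_features=20, n_informative=4, random_state=0)

# Fit a shallow tree to capture relevant features and their interactions.
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
t = tree.tree_

def extract_rules(node=0, conds=()):
    # Each rule is the tuple of (feature, threshold, sign) tests on one root-to-leaf path.
    if t.children_left[node] == -1:  # leaf node
        return [conds]
    f, thr = t.feature[node], t.threshold[node]
    left = extract_rules(t.children_left[node], conds + ((f, thr, -1),))   # x[f] <= thr
    right = extract_rules(t.children_right[node], conds + ((f, thr, +1),)) # x[f] >  thr
    return left + right

rules = extract_rules()

# One hidden unit per rule: only features appearing in the rule get nonzero
# weights, so the first-layer initialization is sparse by construction.
n_hidden, n_features = len(rules), X.shape[1]
W = np.zeros((n_hidden, n_features))
b = np.zeros(n_hidden)
for h, conds in enumerate(rules):
    for f, thr, sign in conds:
        W[h, f] = sign
        b[h] -= sign * thr

# These rule units would then be fine-tuned end to end as part of a neural network.
hidden = np.maximum(0.0, X @ W.T + b)  # ReLU activations of the rule units
print(W.shape, hidden.shape)           # (n_rules, 20), (500, n_rules)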
Pages: 8