Distributed Learning of Deep Sparse Neural Networks for High-dimensional Classification

Cited by: 0
Authors
Garg, Shweta [1]
Krishnan, R. [1]
Jagannathan, S. [1]
Samaranayake, V. A. [2]
Affiliations
[1] Missouri Univ Sci & Technol, Dept Elect & Comp Engn, Rolla, MO 65409 USA
[2] Missouri Univ Sci & Technol, Dept Math & Stat, Rolla, MO USA
Keywords
VARIABLE SELECTION;
DOI
Not available
Chinese Library Classification
TP18 [Theory of Artificial Intelligence]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
While analyzing high-dimensional data-sets using deep neural networks (NN), increased sparsity is desirable but requires careful selection of "sparsity parameters." In this paper, a novel distributed learning methodology is proposed to optimize the NN while addressing this challenge: the optimal sparsity in the NN is estimated via a two-player zero-sum game. In this game, the sparsity parameter is the first player, whose aim is to increase sparsity in the NN, while the NN weights are the second player, whose goal is to improve performance in the presence of increased sparsity. To solve the game, additional variables are introduced into the optimization problem such that the output at every layer of the NN depends on these variables instead of on the previous layer. Using these additional variables, layer-wise cost functions are derived and then independently optimized to learn the additional variables, the NN weights, and the sparsity parameters. To implement the proposed learning procedure in a parallelized and distributed environment, a novel computational algorithm is also proposed. The efficiency of the proposed approach is demonstrated using a total of six data-sets.
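The two-player game described above can be made concrete with a minimal sketch (not the authors' implementation): the weight player runs gradient descent on a sparsity-regularized loss while the sparsity-parameter player runs projected gradient ascent on the same shared cost. The logistic model, the quadratic hedging term that keeps the sparsity player's best response bounded, and all step sizes and variable names below are assumptions made purely for illustration; the paper's layer-wise auxiliary variables and distributed implementation are not reproduced here.

    # Illustrative sketch only: a two-player zero-sum game between the weights w
    # (minimizer) and the sparsity parameter lam (maximizer) on a toy logistic model.
    import numpy as np

    rng = np.random.default_rng(0)
    n, d = 200, 50                              # samples, (high-dimensional) features
    X = rng.normal(size=(n, d))
    w_true = np.zeros(d)
    w_true[:5] = 2.0                            # only 5 of the 50 features are informative
    y = (X @ w_true + 0.1 * rng.normal(size=n) > 0).astype(float)

    w = np.zeros(d)                             # player 2: network weights (minimizer)
    lam = 0.01                                  # player 1: sparsity parameter (maximizer)
    eta_w, eta_lam, c = 0.1, 0.01, 5.0          # step sizes; c (assumed) bounds the lam player

    def game_cost(w, lam):
        p = 1.0 / (1.0 + np.exp(-(X @ w)))      # logistic prediction
        loss = -np.mean(y * np.log(p + 1e-12) + (1 - y) * np.log(1 - p + 1e-12))
        return loss + lam * np.sum(np.abs(w)) - 0.5 * c * lam ** 2

    for _ in range(500):
        p = 1.0 / (1.0 + np.exp(-(X @ w)))
        w -= eta_w * (X.T @ (p - y) / n + lam * np.sign(w))            # descent on the cost
        lam = max(0.0, lam + eta_lam * (np.sum(np.abs(w)) - c * lam))  # ascent on the cost

    print("nonzero weights:", int(np.sum(np.abs(w) > 1e-3)), "lambda:", round(lam, 4))

The quadratic term -0.5 * c * lam**2 is an assumed regularizer that gives the maximizing player a finite best response (lam* = ||w||_1 / c); without some such bound, the sparsity player in this simplified payoff would simply grow lam indefinitely.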
Pages: 1587-1592
Number of pages: 6