Deep belief networks with self-adaptive sparsity

Cited by: 3
Authors
Qiao, Chen [1 ]
Yang, Lan [1 ]
Shi, Yan [1 ]
Fang, Hanfeng [2 ]
Kang, Yanmei [1 ]
Affiliations
[1] Xi An Jiao Tong Univ, Xian 710049, Peoples R China
[2] Suzhou Hanlin Informat Technol Dev Co LTD, Suzhou 215138, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Deep belief networks; Iterative re-weighted L1 minimization algorithm; Self-adaptive sparsity; Contrastive divergence algorithm; Biomedical data; NEURAL-NETWORKS; REPRESENTATIONS; REGULARIZATION; IDENTIFICATION; SCHIZOPHRENIA; CONNECTIVITY
DOI
10.1007/s10489-021-02361-y
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
Sparsity is crucial for deep neural networks: it improves their learning ability, especially on high-dimensional data with small sample sizes. Commonly used regularization terms for keeping deep neural networks sparse are based on the L1-norm or L2-norm; however, these are not the most faithful surrogates of the L0-norm. In this paper, based on the fact that minimizing a log-sum function is an effective approximation to minimizing the L0-norm, a sparse penalty term on the connection weights using the log-sum function is introduced. By embedding the corresponding iterative re-weighted L1 minimization algorithm into k-step contrastive divergence, the connections of deep belief networks can be updated in a sparse, self-adaptive way. Experiments on two kinds of biomedical datasets, both typical small-sample-size datasets with a large number of variables, namely brain functional magnetic resonance imaging (fMRI) data and single nucleotide polymorphism (SNP) data, show that the proposed deep belief networks with self-adaptive sparsity learn layer-wise sparse features effectively. The results demonstrate better performance, in both identification accuracy and sparsity capability, than several typical learning machines.
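The abstract describes the core mechanism: a log-sum penalty on the connection weights, whose gradient yields an adaptive per-weight L1 coefficient (the iterative re-weighted L1 idea), folded into the contrastive divergence weight update of each restricted Boltzmann machine. The following is a minimal NumPy sketch of that combination for a Bernoulli-Bernoulli RBM with CD-1; the function names, hyperparameter values, and the omission of bias updates are illustrative assumptions, not details from the paper.

```python
import numpy as np

def log_sum_penalty(W, eps=1e-2):
    """Log-sum penalty sum(log(1 + |w|/eps)): a closer surrogate
    for the L0-norm than plain L1 or L2 regularization."""
    return np.sum(np.log1p(np.abs(W) / eps))

def reweighted_l1_grad(W, eps=1e-2):
    """Gradient of the log-sum penalty. Each weight receives an
    adaptive L1 coefficient 1/(|w| + eps), so weights already near
    zero are shrunk harder than large ones (self-adaptive sparsity)."""
    return np.sign(W) / (np.abs(W) + eps)

def cd1_sparse_update(W, v0, lr=0.1, lam=1e-3, eps=1e-2, rng=None):
    """One CD-1 step for a Bernoulli RBM with the re-weighted-L1
    sparsity term added to the gradient (biases omitted for brevity).

    W  : (n_visible, n_hidden) weight matrix
    v0 : (batch, n_visible) binary data batch
    """
    rng = rng or np.random.default_rng(0)
    sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))
    h0 = sigmoid(v0 @ W)                            # positive phase
    h0_s = (rng.random(h0.shape) < h0).astype(float)  # sample hiddens
    v1 = sigmoid(h0_s @ W.T)                        # reconstruction
    h1 = sigmoid(v1 @ W)                            # negative phase
    grad = (v0.T @ h0 - v1.T @ h1) / v0.shape[0]    # CD-1 gradient
    return W + lr * (grad - lam * reweighted_l1_grad(W, eps))
```

Because the penalty gradient scales like 1/(|w| + eps), the update leaves large informative weights nearly untouched while driving small weights toward zero, which is the self-adaptive behavior the abstract refers to.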
Pages: 237-253 (17 pages)
Related papers (50 records)
  • [1] Deep belief networks with self-adaptive sparsity
    Chen Qiao
    Lan Yang
    Yan Shi
    Hanfeng Fang
    Yanmei Kang
    [J]. Applied Intelligence, 2022, 52 : 237 - 253
  • [2] Dynamic sparsity control in Deep Belief Networks
    Keyvanrad, Mohammad Ali
    Homayounpour, Mohammad Mehdi
    [J]. INTELLIGENT DATA ANALYSIS, 2017, 21 (04) : 963 - 979
  • [3] DeepNetQoE: Self-Adaptive QoE Optimization Framework of Deep Networks
    Wang, Rui
    Chen, Min
    Guizani, Nadra
    Li, Yong
    Gharavi, Hamid
    Hwang, Kai
    [J]. IEEE NETWORK, 2021, 35 (03): 161 - 167
  • [4] Research on Self-Adaptive Group Key Management in Deep Space Networks
    Jian, Zhou
    Sun Liyan
    Duan Kaiyu
    Yue, Wu
    [J]. WIRELESS PERSONAL COMMUNICATIONS, 2020, 114 (04) : 3435 - 3456
  • [6] Rolling bearing fault diagnosis based on intelligent optimized self-adaptive deep belief network
    Gao, Shuzhi
    Xu, Lintao
    Zhang, Yimin
    Pei, Zhiming
    [J]. MEASUREMENT SCIENCE AND TECHNOLOGY, 2020, 31 (05)
  • [7] Networks of Self-Adaptive Dynamical Systems
    Rodriguez, Julio
    Hongler, Max-Olivier
    [J]. IMA JOURNAL OF APPLIED MATHEMATICS, 2014, 79 (02) : 201 - 240
  • [8] Deep Residual Shrinkage Networks with Self-Adaptive Slope Thresholding for Fault Diagnosis
    Zhang, Zhijin
    Li, He
    Chen, Lei
    [J]. PROCEEDINGS OF 2021 7TH INTERNATIONAL CONFERENCE ON CONDITION MONITORING OF MACHINERY IN NON-STATIONARY OPERATIONS (CMMNO), 2021: 236 - 239
  • [9] Deep Self-Adaptive Hashing for Image Retrieval
    Lin, Qinghong
    Chen, Xiaojun
    Zhang, Qin
    Tian, Shangxuan
    Chen, Yudong
    [J]. PROCEEDINGS OF THE 30TH ACM INTERNATIONAL CONFERENCE ON INFORMATION & KNOWLEDGE MANAGEMENT, CIKM 2021, 2021: 1028 - 1037
  • [10] Self-Adaptive Approximate Mobile Deep Learning
    Knez, Timotej
    Machidon, Octavian
    Pejovic, Veljko
    [J]. ELECTRONICS, 2021, 10 (23)