Review of research on restricted Boltzmann machine and its variants

Cited by: 0
Authors
Wang Q. [1 ]
Gao X. [1 ]
Wu B. [2 ]
Hu Z. [1 ]
Wan K. [1 ]
Affiliations
[1] School of Electronic Information, Northwestern Polytechnical University, Xi'an
[2] Stratégie et Économie d'Entreprise, Paris 1 Panthéon Sorbonne, Paris
Keywords
deep learning; feature extraction; probabilistic undirected graph; restricted Boltzmann machine (RBM); restricted Boltzmann machine variants;
DOI
10.12305/j.issn.1001-506X.2024.07.16
Abstract
As a typical probabilistic graphical model for learning data distributions and extracting intrinsic features, the restricted Boltzmann machine (RBM) is an important foundational model in deep learning. In recent years, numerous emerging models, i.e., RBM variants, have been obtained by improving the model structure and energy function of the RBM, further enhancing its feature extraction performance. The study of RBM and its variants can significantly advance the development of deep learning and enable the rapid extraction of massive information in the era of big data. On this basis, recent research on RBM and its variants is systematically reviewed, covering improvements to training algorithms, model structures, research on fusion with deep models, and the latest applications. In particular, the focus is on sorting out the development history of RBM training algorithms and variants. Finally, the remaining difficulties and challenges in the field of RBM and its variants are discussed, and the main research work is summarized with an outlook on future directions. © 2024 Chinese Institute of Electronics. All rights reserved.
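As a concrete illustration of the energy-based learning the abstract describes, the sketch below implements a minimal Bernoulli-Bernoulli RBM trained with one-step contrastive divergence (CD-1), the likelihood-gradient approximation of reference [9]. It is an illustrative NumPy sketch only; the class and toy data are not from the reviewed paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RBM:
    """Bernoulli-Bernoulli RBM trained with one-step contrastive divergence."""

    def __init__(self, n_visible, n_hidden, lr=0.1):
        self.W = rng.normal(0.0, 0.01, size=(n_visible, n_hidden))
        self.b = np.zeros(n_visible)  # visible bias
        self.c = np.zeros(n_hidden)   # hidden bias
        self.lr = lr

    def sample_h(self, v):
        p = sigmoid(v @ self.W + self.c)           # P(h=1 | v)
        return p, (rng.random(p.shape) < p).astype(float)

    def sample_v(self, h):
        p = sigmoid(h @ self.W.T + self.b)         # P(v=1 | h)
        return p, (rng.random(p.shape) < p).astype(float)

    def cd1_step(self, v0):
        # Positive phase: hidden activations driven by the data.
        ph0, h0 = self.sample_h(v0)
        # Negative phase: one Gibbs step v0 -> h0 -> v1 -> h1.
        pv1, _ = self.sample_v(h0)
        ph1, _ = self.sample_h(pv1)
        n = v0.shape[0]
        # Gradient approximation: <v h>_data - <v h>_model.
        self.W += self.lr * (v0.T @ ph0 - pv1.T @ ph1) / n
        self.b += self.lr * (v0 - pv1).mean(axis=0)
        self.c += self.lr * (ph0 - ph1).mean(axis=0)
        return np.mean((v0 - pv1) ** 2)  # reconstruction error

# Toy data: two repeated binary patterns the model should capture.
data = np.array([[1, 1, 0, 0], [0, 0, 1, 1]] * 50, dtype=float)
rbm = RBM(n_visible=4, n_hidden=2)
errs = [rbm.cd1_step(data) for _ in range(200)]
print(f"reconstruction error: {errs[0]:.3f} -> {errs[-1]:.3f}")
```

The reconstruction error is only a training heuristic, not the true likelihood; the variants surveyed in the paper largely differ in how this negative phase and the energy function are defined.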
Pages: 2323-2345
Page count: 22
References
127 entries in total
  • [1] GOODFELLOW I, BENGIO Y, COURVILLE A., Deep learning, (2017)
  • [2] HINTON G E, SALAKHUTDINOV R R., Reducing the dimensionality of data with neural networks, Science, 313, 5786, pp. 504-507, (2006)
  • [3] HINTON G E, OSINDERO S, TEH Y W., A fast learning algorithm for deep belief nets, Neural Computation, 18, 7, pp. 1527-1554, (2006)
  • [4] REICHERT D P, SERIES P, STORKEY A J., Hallucinations in Charles Bonnet syndrome induced by homeostasis: a deep Boltzmann machine model, Proc. of the Neural Information Processing Systems Conference, (2010)
  • [5] CHO K H, RAIKO T, ILIN A., Gaussian Bernoulli deep Boltzmann machine, Proc. of the International Joint Conference on Neural Networks, (2013)
  • [6] SCHOLKOPF B, PLATT J, HOFMANN T., Greedy layer-wise training of deep networks, Proc. of the Neural Information Processing Systems Conference, 19, pp. 153-160, (2007)
  • [7] HU M F, ZUO X, LIU J W., Survey on deep generative model[J], Acta Automatica Sinica, 48, 1, pp. 40-74, (2022)
  • [8] FISCHER A., Training restricted Boltzmann machines, Künstliche Intelligenz, 29, 4, pp. 441-444, (2015)
  • [9] HINTON G E., Training products of experts by minimizing contrastive divergence, Neural Computation, 14, 8, pp. 1771-1800, (2002)
  • [10] TIELEMAN T., Training restricted Boltzmann machines using approximations to the likelihood gradient, Proc. of the International Conference on Machine Learning, (2008)