A universal training scheme and the resulting universality for machine learning phases

Cited by: 3
Authors
Tseng, Yuan-Heng [1 ]
Jiang, Fu-Jiun [1 ]
Huang, C-Y [2 ]
Affiliations
[1] Natl Taiwan Normal Univ, Dept Phys, 88,Sec 4,Ting Chou Rd, Taipei 116, Taiwan
[2] Tunghai Univ, Dept Appl Phys, 1727,Sec 4,Taiwan Blvd, Taichung 40704, Taiwan
Keywords
NEURAL-NETWORK; TRANSITIONS;
DOI
10.1093/ptep/ptac173
Chinese Library Classification
O4 [Physics]
Subject Classification Code
0702
Abstract
An autoencoder (AE) and a generative adversarial network (GAN) are trained only once on a one-dimensional (1D) lattice of 200 sites. Moreover, the AE contains only one hidden layer consisting of two neurons, and both the generator and the discriminator of the GAN are made up of two neurons as well. The training set employed to train both of the considered unsupervised neural networks (NNs) consists of two artificial configurations. Remarkably, despite their simple architectures, both the AE and the GAN precisely determine the critical points of several models, including the three-dimensional classical O(3) model, the two-dimensional generalized classical XY model, the two-dimensional two-state Potts model, and the one-dimensional Bose-Hubbard model. In addition, the AE and GAN built here are faster than conventional unsupervised NN approaches by a factor of several thousand. The results presented here, as well as those shown previously in the literature, suggest that when phase transitions are considered, an elegant and extremely efficient universal neural network applicable to broad physical systems can be constructed with ease. In particular, since an NN trained with two configurations can be applied to many models, it is likely that, where machine learning is concerned, the majority of phase transitions belong to a class with two elements, i.e. the Ising class.
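To make the scale of the setup concrete, the following is a minimal sketch, not the authors' actual code: an autoencoder with a single hidden layer of two neurons, trained once on two artificial configurations defined on a 1D lattice of 200 sites. The choice of all-ones and all-zeros as the two artificial configurations, the sigmoid activations, and the training hyperparameters are all assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
L = 200  # number of lattice sites

# Two artificial training configurations (illustrative choice): a fully
# "ordered" one (all ones) and a fully "disordered" one (all zeros).
X = np.vstack([np.ones(L), np.zeros(L)])

# Encoder 200 -> 2 and decoder 2 -> 200, sigmoid activations throughout.
W1 = rng.normal(0.0, 0.1, (L, 2)); b1 = np.zeros(2)
W2 = rng.normal(0.0, 0.1, (2, L)); b2 = np.zeros(L)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for _ in range(3000):
    H = sigmoid(X @ W1 + b1)   # hidden codes, shape (2, 2)
    Y = sigmoid(H @ W2 + b2)   # reconstructions, shape (2, 200)
    dY = (Y - X) / len(X)      # cross-entropy gradient w.r.t. pre-activation
    dH = (dY @ W2.T) * H * (1.0 - H)
    W2 -= lr * H.T @ dY; b2 -= lr * dY.sum(axis=0)
    W1 -= lr * X.T @ dH; b1 -= lr * dH.sum(axis=0)

def signal(cfg):
    """Mean autoencoder output for one configuration of length L."""
    return sigmoid(sigmoid(cfg @ W1 + b1) @ W2 + b2).mean()
```

After training, `signal` is large for ordered-like configurations and small for disordered-like ones; evaluating it on Monte Carlo configurations across a range of temperatures and locating the crossover would then estimate a critical point, in the spirit of the strategy the abstract describes.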
Pages: 15
Related papers (50 in total)
  • [1] A universal scheme for learning
    Farias, VF
    Moallemi, CC
    Van Roy, B
    Weissman, T
    2005 IEEE International Symposium on Information Theory (ISIT), Vols 1 and 2, 2005: 1158-1162
  • [2] Machine learning phases and criticalities without using real data for training
    Tan, D-R
    Jiang, F-J
    PHYSICAL REVIEW B, 2020, 102 (22)
  • [3] Universal Identification Scheme in Machine-to-Machine Systems
    Katusic, Damjan
    Skocir, Pavle
    Bojic, Iva
    Kusek, Mario
    Jezic, Gordan
    Desic, Sasa
    Huljenic, Darko
    PROCEEDINGS OF THE 12TH INTERNATIONAL CONFERENCE ON TELECOMMUNICATIONS (CONTEL 2013), 2013: 71-78
  • [4] Overlapped Data Processing Scheme for Accelerating Training and Validation in Machine Learning
    Choi, Jinseo
    Kang, Donghyun
    IEEE ACCESS, 2022, 10: 72015-72023
  • [5] A universal neural network for learning phases
    Tan, D-R
    Peng, J-H
    Tseng, Y-H
    Jiang, F-J
    EUROPEAN PHYSICAL JOURNAL PLUS, 2021, 136 (11)
  • [6] Universality class of machine learning for critical phenomena
    Hu, Gaoke
    Sun, Yu
    Liu, Teng
    Zhang, Yongwen
    Liu, Maoxin
    Fan, Jingfang
    Chen, Wei
    Chen, Xiaosong
    SCIENCE CHINA-PHYSICS MECHANICS & ASTRONOMY, 2023, 66 (12)
  • [7] Machine learning phases of matter
    Carrasquilla, Juan
    Melko, Roger G.
    NATURE PHYSICS, 2017, 13 (05): 431-434