A universal training scheme and the resulting universality for machine learning phases

Cited: 3
Authors
Tseng, Yuan-Heng [1 ]
Jiang, Fu-Jiun [1 ]
Huang, C-Y [2 ]
Affiliations
[1] Natl Taiwan Normal Univ, Dept Phys, 88,Sec 4,Ting Chou Rd, Taipei 116, Taiwan
[2] Tunghai Univ, Dept Appl Phys, 1727,Sec 4,Taiwan Blvd, Taichung 40704, Taiwan
Keywords
NEURAL-NETWORK; TRANSITIONS;
DOI
10.1093/ptep/ptac173
Chinese Library Classification
O4 [Physics]
Discipline code
0702
Abstract
An autoencoder (AE) and a generative adversarial network (GAN) are trained only once on a one-dimensional (1D) lattice of 200 sites. Moreover, the AE contains only one hidden layer consisting of two neurons, and both the generator and the discriminator of the GAN are made up of two neurons as well. The training set employed to train both of the considered unsupervised neural networks (NNs) consists of two artificial configurations. Remarkably, despite their simple architectures, both the AE and the GAN precisely determine the critical points of several models, including the three-dimensional classical O(3) model, the two-dimensional generalized classical XY model, the two-dimensional two-state Potts model, and the one-dimensional Bose-Hubbard model. In addition, the built AE and GAN are faster than conventional unsupervised NN approaches by a factor of several thousand. The results presented here, together with those previously reported in the literature, suggest that when phase transitions are considered, an elegant universal NN that is extremely efficient and applicable to a broad range of physical systems can be constructed with ease. In particular, since an NN trained on two configurations can be applied to many models, it is likely that, as far as machine learning is concerned, the majority of phase transitions belong to a class having two elements, i.e., the Ising class.
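To make the scale of the architecture concrete, the following is a hypothetical minimal sketch (not the authors' code) of an autoencoder of the kind the abstract describes: a single hidden layer of two neurons, trained once on two artificial 200-site configurations. The choice of all-zero and all-one configurations, the sigmoid activations, and the training hyperparameters are all assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)
L = 200  # number of lattice sites, as in the abstract

# Two artificial training configurations (assumed form): all sites 0, all sites 1.
X = np.array([np.zeros(L), np.ones(L)])  # shape (2, 200)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Encoder (200 -> 2) and decoder (2 -> 200) weights: one two-neuron hidden layer.
W1 = rng.normal(scale=0.1, size=(L, 2)); b1 = np.zeros(2)
W2 = rng.normal(scale=0.1, size=(2, L)); b2 = np.zeros(L)

lr = 0.5
for _ in range(2000):
    H = sigmoid(X @ W1 + b1)   # hidden activations, shape (2, 2)
    Y = sigmoid(H @ W2 + b2)   # reconstructions, shape (2, 200)
    # Backpropagate the squared reconstruction error through both layers.
    dY = (Y - X) * Y * (1 - Y)
    dH = (dY @ W2.T) * H * (1 - H)
    W2 -= lr * H.T @ dY; b2 -= lr * dY.sum(axis=0)
    W1 -= lr * X.T @ dH; b1 -= lr * dH.sum(axis=0)

H = sigmoid(X @ W1 + b1)
mse = np.mean((sigmoid(H @ W2 + b2) - X) ** 2)
print(f"final reconstruction MSE: {mse:.4f}")
```

With only two training samples and two hidden neurons, the network has very few parameters to fit, which is consistent with the abstract's claim that such a once-trained NN is orders of magnitude cheaper than conventional unsupervised approaches.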
Pages: 15
Related papers
50 records in total
  • [41] zkMLaaS: a Verifiable Scheme for Machine Learning as a Service
    Huang, Chenyu
    Wang, Jianzong
    Chen, Huangxun
    Si, Shijing
    Huang, Zhangcheng
    Xiao, Jing
    2022 IEEE GLOBAL COMMUNICATIONS CONFERENCE (GLOBECOM 2022), 2022, : 5475 - 5480
  • [42] Machine Learning Out-of-Equilibrium Phases of Matter
    Venderley, Jordan
    Khemani, Vedika
    Kim, Eun-Ah
    PHYSICAL REVIEW LETTERS, 2018, 120 (25)
  • [43] Machine learning non-Hermitian topological phases
    Narayan, Brajesh
    Narayan, Awadhesh
    PHYSICAL REVIEW B, 2021, 103 (03)
  • [44] Reconstructing S-matrix Phases with Machine Learning
    Dersy, Aurelien
    Schwartz, Matthew D.
    Zhiboedov, Alexander
    JOURNAL OF HIGH ENERGY PHYSICS, 2024, (05):
  • [45] Machine learning wave functions to identify fractal phases
    Cadez, Tilen
    Dietz, Barbara
    Rosa, Dario
    Andreanov, Alexei
    Slevin, Keith
    Ohtsuki, Tomi
    PHYSICAL REVIEW B, 2023, 108 (18)
  • [46] TRAINING PRIMARY SCHOOL TEACHERS ON UNIVERSAL DESIGN FOR LEARNING
    Gavaldon, Guillermina
    Alba Pastor, Carmen
    EDULEARN15: 7TH INTERNATIONAL CONFERENCE ON EDUCATION AND NEW LEARNING TECHNOLOGIES, 2015, : 3559 - 3566
  • [47] Multilingual Pre-training with Universal Dependency Learning
    Sun, Kailai
    Li, Zuchao
    Zhao, Hai
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [48] DEVELOPING THE SKILLS OF LEARNING IN THE YOUTH TRAINING SCHEME
    DRAKELEY, R
    BULLETIN OF THE BRITISH PSYCHOLOGICAL SOCIETY, 1984, 37 (MAY): A89 - A89
  • [49] A comment on the training of unsupervised neural networks for learning phases
    Tseng, Yuan-Heng
    Jiang, Fu-Jiun
    RESULTS IN PHYSICS, 2022, 40
  • [50] Machine learning phase transitions of the three-dimensional Ising universality class
    Li, Xiao-Bing
    Guo, Ran-Ran
    Zhou, Yu
    Liu, Kang-Ning
    Zhao, Jia
    Long, Fen
    Wu, Yuan-Fang
    Li, Zhi-Ming
    Chinese Physics C, 2023, 47 (03) : 142 - 149