Variational Monte Carlo with large patched transformers

Cited by: 1
Authors
Sprague, Kyle [1 ]
Czischek, Stefanie [1 ]
Affiliations
[1] Univ Ottawa, Dept Phys, Ottawa, ON K1N 6N5, Canada
Keywords
QUANTUM; ATOM;
DOI
10.1038/s42005-024-01584-y
Chinese Library Classification
O4 [Physics];
Discipline classification code
0702 ;
Abstract
Large language models, like transformers, have recently demonstrated immense power in text and image generation. This success is driven by their ability to capture long-range correlations between elements in a sequence. The same feature makes the transformer a powerful wavefunction ansatz that addresses the challenge of describing correlations in simulations of qubit systems. Here we consider two-dimensional Rydberg atom arrays to demonstrate that transformers reach higher accuracies than conventional recurrent neural networks for variational ground-state searches. We further introduce large, patched transformer models, which consider a sequence of large atom patches, and show that this architecture significantly accelerates the simulations. The proposed architectures reconstruct ground states with accuracies beyond state-of-the-art quantum Monte Carlo methods, allowing for the study of large Rydberg systems in different phases of matter and at phase transitions. Our high-accuracy ground-state representations at reasonable computational costs promise new insights into general large-scale quantum many-body systems.

Ground-state representations with artificial neural network methods enable high-accuracy simulations of quantum many-body systems. The authors study the performance of the transformer network architecture on this task and demonstrate its vast potential for novel findings in quantum physics.
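The "patched" idea summarized above, grouping the 2D atom array into patches so the transformer models a shorter sequence of multi-atom tokens, can be sketched as plain tokenization code. This is an illustrative reconstruction, not the authors' implementation: the function names (`to_patch_tokens`, `from_patch_tokens`) and the row-major bit encoding of a patch are assumptions made for the example. The transformer itself (which would model the conditional distribution over each patch token given the previous ones) is omitted.

```python
import numpy as np

def to_patch_tokens(occ, p):
    """Map an L x L array of Rydberg occupations (0/1) to a 1D sequence
    of patch tokens. Each p x p patch of atoms becomes one integer in
    [0, 2**(p*p)), with the patch read row by row as a binary string.
    A patched transformer would model this shorter sequence
    autoregressively instead of the full atom-by-atom sequence."""
    L = occ.shape[0]
    assert L % p == 0, "lattice size must be divisible by the patch size"
    n = L // p
    tokens = []
    for i in range(n):
        for j in range(n):
            patch = occ[i * p:(i + 1) * p, j * p:(j + 1) * p].ravel()
            tokens.append(int("".join(str(b) for b in patch), 2))
    return tokens

def from_patch_tokens(tokens, L, p):
    """Inverse map: rebuild the L x L occupation array from patch tokens,
    undoing the row-major binary encoding used above."""
    n = L // p
    occ = np.zeros((L, L), dtype=int)
    for k, t in enumerate(tokens):
        i, j = divmod(k, n)
        bits = [(t >> b) & 1 for b in reversed(range(p * p))]
        occ[i * p:(i + 1) * p, j * p:(j + 1) * p] = np.array(bits).reshape(p, p)
    return occ
```

In an autoregressive ansatz of this kind, the network outputs conditionals p(token_k | token_1, ..., token_{k-1}) over the 2^(p*p) patch states, the configuration probability is their product, and (for a positive ground-state wavefunction, as applies to Rydberg arrays) the amplitude is its square root. Larger patches shorten the sequence quadratically at the cost of a larger per-token output space.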
Pages: 11