Scrambling ability of quantum neural network architectures

Cited by: 17
Authors
Wu, Yadong [1 ]
Zhang, Pengfei [2 ,3 ]
Zhai, Hui [1 ]
Affiliations
[1] Tsinghua Univ, Inst Adv Study, Beijing 100084, Peoples R China
[2] CALTECH, Inst Quantum Informat & Matter, Pasadena, CA 91125 USA
[3] Walter Burke Inst Theoret Phys, CALTECH, Pasadena, CA 91125 USA
Source
PHYSICAL REVIEW RESEARCH | 2021, Vol. 3, Issue 03
Keywords
Quantum optics; Machine learning; Network architecture; Efficiency; Personnel training; Qubits
DOI
10.1103/PhysRevResearch.3.L032057
Chinese Library Classification (CLC)
O4 [Physics]
Discipline Code
0702
Abstract
In this Letter, we propose a guiding principle for how to design the architecture of a quantum neural network in order to achieve a high learning efficiency. This principle is inspired by the equivalence between extracting information from the input state to the readout qubit and scrambling information from the readout qubit to input qubits. We characterize the quantum information scrambling by operator size growth. By Haar random averaging over operator sizes, we propose an averaged operator size to describe the information scrambling ability of a given quantum neural network architecture. The key conjecture of this Letter is that this quantity is positively correlated with the learning efficiency of this architecture. To support this conjecture, we consider several different architectures, and we also consider two typical learning tasks. One is a regression task of a quantum problem, and the other is a classification task on classical images. In both cases, we find that, for the architecture with a larger averaged operator size, the loss function decreases faster or the prediction accuracy increases faster as the training epoch increases, which means higher learning efficiency. Our results can be generalized to more complicated quantum versions of machine learning algorithms.
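The averaged operator size described in the abstract can be estimated numerically for small circuits. The following Python/NumPy sketch is an illustration based only on the abstract, not the authors' code: it Heisenberg-evolves the readout qubit's Pauli Z through a small parameterized circuit, decomposes the result in the Pauli basis, and averages the Pauli weight over random parameter draws. The qubit count, layer count, and gate set (random single-qubit rotations plus a chain of CZ entanglers) are illustrative assumptions, and the Monte Carlo average over parameters stands in for the Haar averaging used in the paper.

import itertools
import numpy as np

I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
PAULIS = {"I": I2, "X": X, "Y": Y, "Z": Z}

def kron_all(mats):
    # Tensor product of a list of matrices, left to right.
    out = np.array([[1.0 + 0j]])
    for m in mats:
        out = np.kron(out, m)
    return out

def random_circuit(n_qubits, n_layers, rng):
    # Illustrative architecture: layers of random single-qubit rotations
    # followed by CZ gates on nearest neighbours.
    cz = np.diag([1, 1, 1, -1]).astype(complex)
    U = np.eye(2 ** n_qubits, dtype=complex)
    for _ in range(n_layers):
        singles = []
        for _ in range(n_qubits):
            theta = rng.uniform(0, 2 * np.pi)
            axis = rng.normal(size=3)
            axis /= np.linalg.norm(axis)
            H = axis[0] * X + axis[1] * Y + axis[2] * Z
            w, V = np.linalg.eigh(H)  # build exp(-i * theta * H / 2)
            singles.append(V @ np.diag(np.exp(-1j * theta * w / 2)) @ V.conj().T)
        U = kron_all(singles) @ U
        for q in range(n_qubits - 1):
            U = kron_all([I2] * q + [cz] + [I2] * (n_qubits - q - 2)) @ U
    return U

def operator_size(O, n_qubits):
    # Decompose O = sum_P c_P P in the Pauli basis and return the
    # |c_P|^2-weighted average number of non-identity factors.
    dim = 2 ** n_qubits
    norm, weighted = 0.0, 0.0
    for labels in itertools.product("IXYZ", repeat=n_qubits):
        P = kron_all([PAULIS[l] for l in labels])
        p = abs(np.trace(P @ O) / dim) ** 2
        norm += p
        weighted += p * sum(l != "I" for l in labels)
    return weighted / norm

rng = np.random.default_rng(0)
n_qubits, n_layers, n_samples = 3, 4, 20
Z_readout = kron_all([Z] + [I2] * (n_qubits - 1))  # Pauli Z on the readout qubit
sizes = []
for _ in range(n_samples):
    U = random_circuit(n_qubits, n_layers, rng)
    O = U.conj().T @ Z_readout @ U  # Heisenberg-evolved readout operator
    sizes.append(operator_size(O, n_qubits))
print("estimated averaged operator size:", np.mean(sizes))

Under the abstract's conjecture, comparing this quantity across candidate architectures (for instance, different entangler connectivities in random_circuit) would rank their expected learning efficiency.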
Pages: 6
Related Papers
50 records in total
  • [1] Narayanan A, Menneer T. Quantum artificial neural network architectures and components. INFORMATION SCIENCES, 2000, 128(3-4): 231-255
  • [2] Zhong Y-H, Man C-Q. Analysis of quantum neural network learning ability. APPLIED MATHEMATICS & INFORMATION SCIENCES, 2013, 7(02): 679-683
  • [3] Garcia R J, Bu K, Jaffe A. Quantifying scrambling in quantum neural networks. JOURNAL OF HIGH ENERGY PHYSICS, 2022, 2022(03)
  • [4] Shen H, Zhang P, You Y-Z, Zhai H. Information scrambling in quantum neural networks. PHYSICAL REVIEW LETTERS, 2020, 124(20)
  • [5] Lee C, Araujo I F, Kim D, Lee J, Park S, Ryu J-Y, Park D K. Optimizing quantum convolutional neural network architectures for arbitrary data dimension. FRONTIERS IN PHYSICS, 2025, 13
  • [6] Wilamowski B M. Neural network architectures and learning. 2003 IEEE INTERNATIONAL CONFERENCE ON INDUSTRIAL TECHNOLOGY, VOLS 1 AND 2, PROCEEDINGS, 2003: TU1-TU12
  • [7] Holena M. Lattices of neural network architectures. NEURAL NETWORK WORLD, 1994, 4(04): 435-464
  • [8] Kubat M, Koprinska I. Initialization of neural network architectures. SECOND INTERNATIONAL CONFERENCE ON NONLINEAR PROBLEMS IN AVIATION & AEROSPACE, VOL 1 AND 2, 1999: 373-380
  • [9] Vignac C, Ortiz-Jimenez G, Frossard P. On the choice of graph neural network architectures. 2020 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING, 2020: 8489-8493