On fundamental aspects of quantum extreme learning machines

Cited: 0
Authors
Xiong, Weijie [1 ]
Facelli, Giorgio [1 ]
Sahebi, Mehrad [1 ]
Agnel, Owen [2 ]
Chotibut, Thiparat [3 ]
Thanasilp, Supanut [1 ,3 ]
Holmes, Zoe [1 ]
Affiliations
[1] Ecole Polytech Fed Lausanne EPFL, CH-1015 Lausanne, Switzerland
[2] Univ Oxford, Dept Comp Sci, Oxford, England
[3] Chulalongkorn Univ, Fac Sci, Dept Phys, Chula Intelligent & Complex Syst Lab, Bangkok, Thailand
Funding
Swiss National Science Foundation
Keywords
DOI
10.1007/s42484-025-00239-7
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Quantum extreme learning machines (QELMs) have emerged as a promising framework for quantum machine learning. Their appeal lies in the rich feature map induced by the dynamics of a quantum substrate (the quantum reservoir) and the efficient post-measurement training via linear regression. Here, we study the expressivity of QELMs by decomposing the prediction of QELMs into a Fourier series. We show that the achievable Fourier frequencies are determined by the data encoding scheme, while the Fourier coefficients depend on both the reservoir and the measurement. Notably, the expressivity of QELMs is fundamentally limited by the number of Fourier frequencies and the number of observables, while the complexity of the prediction hinges on the reservoir. As a cautionary note on scalability, we identify four sources that can lead to the exponential concentration of the observables as the system size grows (randomness, hardware noise, entanglement, and global measurements) and show how this can turn QELMs into useless input-agnostic oracles. In particular, our result on reservoir-induced concentration strongly indicates that quantum reservoirs drawn from a highly random ensemble make QELM models unscalable. Our analysis elucidates the potential and fundamental limitations of QELMs and lays the groundwork for systematically exploring quantum reservoir systems for other machine learning tasks.
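The pipeline the abstract describes (a fixed quantum reservoir producing a feature map, followed by linear-regression training on measured observables) can be mocked up classically for a few qubits. The sketch below is illustrative and not the paper's construction: the reservoir is a random unitary, the data is angle-encoded by single-qubit rotations, the observables are per-qubit Pauli-Z expectations, and the target function is a toy choice. It also makes the Fourier structure visible: each feature is a trigonometric polynomial in the input, with frequencies set by the encoding.

```python
import numpy as np

rng = np.random.default_rng(0)
n_qubits = 3
dim = 2 ** n_qubits

def random_unitary(d, rng):
    # Haar-distributed unitary via QR decomposition of a complex Gaussian matrix
    z = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    q, r = np.linalg.qr(z)
    phases = np.diag(r) / np.abs(np.diag(r))
    return q * phases  # fix the column phases so the ensemble is Haar

def encode(x):
    # Angle encoding: RY(x)|0> on each qubit -> product state of cos/sin amplitudes
    single = np.array([np.cos(x / 2), np.sin(x / 2)])
    state = single
    for _ in range(n_qubits - 1):
        state = np.kron(state, single)
    return state

U = random_unitary(dim, rng)  # the fixed "reservoir" (illustrative stand-in)

def features(x):
    # Measure <Z_q> on each qubit after the reservoir dynamics
    psi = U @ encode(x)
    probs = np.abs(psi) ** 2
    feats = []
    for q in range(n_qubits):
        # Z eigenvalue (+1/-1) of qubit q in each computational-basis state
        signs = np.array([1 - 2 * ((i >> (n_qubits - 1 - q)) & 1)
                          for i in range(dim)])
        feats.append(probs @ signs)
    return np.array(feats)

# Efficient post-measurement training: plain least squares on the readout weights
xs = np.linspace(-np.pi, np.pi, 40)
Phi = np.array([features(x) for x in xs])  # design matrix of observables
y = np.sin(xs)                             # toy target function
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
pred = Phi @ w
```

With only three observables the model can express at most a three-dimensional span of trigonometric features, illustrating the abstract's point that expressivity is capped by the number of observables, independent of how rich the reservoir dynamics are.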
Pages: 33