Exponential concentration in quantum kernel methods

Cited by: 14
Authors
Thanasilp, Supanut [1,2,3]
Wang, Samson [4]
Cerezo, M. [5,6]
Holmes, Zoe [2,5]
Affiliations
[1] Natl Univ Singapore, Ctr Quantum Technol, 3 Sci Dr 2, Singapore, Singapore
[2] Ecole Polytech Fed Lausanne EPFL, Inst Phys, Lausanne, Switzerland
[3] Chulalongkorn Univ, Fac Sci, Dept Phys, Chula Intelligent & Complex Syst, Bangkok, Thailand
[4] Imperial Coll London, London, England
[5] Los Alamos Natl Lab, Informat Sci, Los Alamos, NM 87545 USA
[6] Quantum Sci Ctr, Oak Ridge, TN USA
Funding
National Research Foundation, Singapore
Keywords
DOI
10.1038/s41467-024-49287-w
Chinese Library Classification
O [Mathematical Sciences and Chemistry]; P [Astronomy, Earth Sciences]; Q [Biological Sciences]; N [General Natural Sciences]
Subject Classification Code
07; 0710; 09
Abstract
Kernel methods in Quantum Machine Learning (QML) have recently gained significant attention as a potential candidate for achieving a quantum advantage in data analysis. Among other attractive properties, when training a kernel-based model one is guaranteed to find the model's optimal parameters due to the convexity of the training landscape. However, this rests on the assumption that the quantum kernel can be efficiently obtained from quantum hardware. In this work we study the performance of quantum kernel models from the perspective of the resources needed to accurately estimate kernel values. We show that, under certain conditions, values of quantum kernels over different input data can be exponentially concentrated (in the number of qubits) towards some fixed value. Thus, on training with a polynomial number of measurements, one ends up with a trivial model whose predictions on unseen inputs are independent of the input data. We identify four sources that can lead to concentration: the expressivity of the data embedding, global measurements, entanglement, and noise. For each source, an associated concentration bound on quantum kernels is analytically derived. Lastly, we show that when dealing with classical data, training a parametrized data embedding with a kernel-alignment method is also susceptible to exponential concentration. Our results are verified through numerical simulations for several QML tasks. Altogether, we provide guidelines indicating that certain features should be avoided to ensure the efficient evaluation of quantum kernels and thus the performance of quantum kernel methods.

Quantum kernel methods are usually believed to enjoy better trainability than quantum neural networks, which may suffer from well-studied barren plateaus. Here, building on previous evidence, the authors show that a practical consequence of exponential concentration is a trivial, data-insensitive model after training, and they identify commonly used features that induce the concentration.
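The expressivity-induced concentration discussed in the abstract can be illustrated with a short numerical sketch. The Python snippet below is not taken from the paper; it models a maximally expressive data embedding by mapping each hypothetical input to a Haar-random n-qubit state and samples the fidelity kernel K(x, x') = |<psi(x)|psi(x')>|^2 over random input pairs. For Haar-random states the kernel mean is 1/2^n and its variance is of order 1/4^n, so both the printed mean and standard deviation shrink exponentially with the number of qubits. The function names and the sampling setup are illustrative assumptions, not the authors' code.

# Illustrative sketch (not from the paper): concentration of the fidelity
# kernel when a highly expressive embedding is modelled by Haar-random states.
import numpy as np

rng = np.random.default_rng(0)

def haar_random_state(dim, rng):
    # A normalized complex Gaussian vector is Haar-distributed on the unit sphere.
    v = rng.normal(size=dim) + 1j * rng.normal(size=dim)
    return v / np.linalg.norm(v)

def fidelity_kernel(psi, phi):
    # Fidelity (state-overlap) kernel between two pure states.
    return np.abs(np.vdot(psi, phi)) ** 2

n_samples = 500
for n_qubits in range(2, 11, 2):
    dim = 2 ** n_qubits
    vals = np.array([
        fidelity_kernel(haar_random_state(dim, rng), haar_random_state(dim, rng))
        for _ in range(n_samples)
    ])
    print(f"n = {n_qubits:2d} qubits: mean K = {vals.mean():.3e}, std K = {vals.std():.3e}")

Resolving such exponentially small kernel values would require an exponentially large number of measurement shots, which is why a polynomial measurement budget yields an effectively input-independent Gram matrix.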
Pages: 13
Related papers
50 items in total
  • [41] Image Segmentation Based on Exponential Kernel Function. Yang, Dingding; Tang, Liming; Chen, Shiqiang; Li, Jun. 2017 14th International Symposium on Pervasive Systems, Algorithms and Networks & 2017 11th International Conference on Frontier of Computer Science and Technology & 2017 Third International Symposium of Creative Computing (ISPAN-FCST-ISCC), 2017: 293-298.
  • [42] Quantum Exponential Function. Woronowicz, S. L. Reviews in Mathematical Physics, 2000, 12 (06): 873-920.
  • [43] Exponential Divergence Estimates and Heat Kernel Tail. Nualart, E. Comptes Rendus Mathematique, 2004, 338 (01): 77-80.
  • [44] Using Exponential Kernel for Word Sense Disambiguation. Wang, Tinghua; Rao, Junyang; Zhao, Dongyan. Artificial Neural Networks and Machine Learning - ICANN 2013, 2013, 8131: 545-552.
  • [45] Quantum Classifier with Tailored Quantum Kernel. Blank, Carsten; Park, Daniel K.; Rhee, June-Koo Kevin; Petruccione, Francesco. npj Quantum Information, 2020, 6 (01).
  • [47] Quantum Tangent Kernel. Shirai, Norihito; Kubo, Kenji; Mitarai, Kosuke; Fujii, Keisuke. Physical Review Research, 2024, 6 (03).
  • [48] Exponential Concentration of Cover Times. Zhai, Alex. Electronic Journal of Probability, 2018, 23.
  • [49] Eigenvalues Ratio for Kernel Selection of Kernel Methods. Liu, Yong; Liao, Shizhong. Proceedings of the Twenty-Ninth AAAI Conference on Artificial Intelligence, 2015: 2814-2820.
  • [50] Exponential Peer Methods. Weiner, Ruediger; El-Azab, Tamer. Applied Numerical Mathematics, 2012, 62 (10): 1335-1348.