Transfer Learning for Quantum Classifiers: An Information-Theoretic Generalization Analysis

Cited by: 2
Authors:
Jose, Sharu Theresa [1]
Simeone, Osvaldo [2]
Affiliations:
[1] Univ Birmingham, Dept Comp Sci, Birmingham, England
[2] Kings Coll London, Dept Engn, London, England
Funding:
European Research Council; UK Engineering and Physical Sciences Research Council
DOI:
10.1109/ITW55543.2023.10160236
Chinese Library Classification (CLC): TP [Automation technology; Computer technology]
Discipline code: 0812
Abstract
A key component of a quantum machine learning model operating on classical inputs is the design of an embedding circuit that maps inputs to a quantum state. This paper studies a transfer learning setting in which the classical-to-quantum embedding is carried out by an arbitrary parametric quantum circuit that is pre-trained on data from a source task. At run time, a binary quantum classifier acting on the embedding is optimized using data from the target task of interest. The average excess risk, i.e., the optimality gap, of the resulting classifier depends on how (dis)similar the source and target tasks are. We introduce a new measure of (dis)similarity between binary quantum classification tasks via the trace distance. An upper bound on the optimality gap is derived in terms of the proposed task (dis)similarity measure, two Rényi mutual information terms between the classical input and the quantum embedding under the source and target tasks, and a measure of complexity of the combined space of quantum embeddings and classifiers under the source task. The theoretical results are validated on a simple binary classification example.
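The task (dis)similarity measure in the abstract is built on the trace distance between quantum states. As a minimal illustration (not the paper's full task-dissimilarity construction), the trace distance D(ρ, σ) = ½ Tr|ρ − σ| between two density matrices can be computed from the eigenvalues of their Hermitian difference; `trace_distance` below is a generic helper introduced here for illustration:

```python
import numpy as np

def trace_distance(rho, sigma):
    """Trace distance D(rho, sigma) = 0.5 * Tr|rho - sigma|.

    For Hermitian rho - sigma, Tr|rho - sigma| equals the sum of the
    absolute values of its eigenvalues.
    """
    eigvals = np.linalg.eigvalsh(rho - sigma)  # real eigenvalues of Hermitian matrix
    return 0.5 * np.abs(eigvals).sum()

# Orthogonal pure states |0><0| and |1><1| are perfectly distinguishable:
rho0 = np.array([[1, 0], [0, 0]], dtype=complex)
rho1 = np.array([[0, 0], [0, 1]], dtype=complex)
print(trace_distance(rho0, rho1))  # → 1.0
print(trace_distance(rho0, rho0))  # → 0.0
```

The trace distance ranges from 0 (identical states) to 1 (orthogonal states), which is what makes it a natural basis for quantifying how far apart the state distributions induced by two classification tasks are.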
Pages: 532-537
Page count: 6