Quantum self-attention neural networks for text classification

Cited by: 10
Authors
Li, Guangxi [1,2]
Zhao, Xuanqiang [1,3]
Wang, Xin [1,4]
Affiliations
[1] Baidu Research, Institute for Quantum Computing, Beijing 100193, China
[2] University of Technology Sydney, Centre for Quantum Software and Information, Sydney, NSW 2007, Australia
[3] University of Hong Kong, Department of Computer Science, QICI Quantum Information and Computation Initiative, Hong Kong 999077, China
[4] Hong Kong University of Science and Technology (Guangzhou), Artificial Intelligence Thrust, Information Hub, Guangzhou 511453, China
Funding
Australian Research Council
Keywords
quantum neural networks; self-attention; natural language processing; text classification; parameterized quantum circuits
DOI
10.1007/s11432-023-3879-7
CLC number
TP [automation technology; computer technology]
Discipline code
0812
Abstract
An emerging direction of quantum computing is to establish meaningful quantum applications in various fields of artificial intelligence, including natural language processing (NLP). Although some efforts based on syntactic analysis have opened the door to research in quantum NLP (QNLP), limitations such as heavy syntactic preprocessing and syntax-dependent network architectures make them impractical on larger, real-world data sets. In this paper, we propose a simple new network architecture, called the quantum self-attention neural network (QSANN), which compensates for these limitations. Specifically, we introduce the self-attention mechanism into quantum neural networks and use a Gaussian projected quantum self-attention as a sensible quantum analogue of classical self-attention. As a result, QSANN is effective and scalable on larger data sets and has the desirable property of being implementable on near-term quantum devices. In particular, our QSANN outperforms the best existing syntax-based QNLP model, as well as a simple classical self-attention neural network, in numerical experiments on text classification tasks over public data sets. We further show that our method is robust to low-level quantum noise and resilient to changes in the quantum neural network architecture.
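To make the mechanism concrete, one way to read the "Gaussian projected" coefficient is alpha_{s,j} proportional to exp(-(q_s - k_j)^2), where q_s and k_j are scalar measurement outcomes of the query and key parameterized quantum circuits. The NumPy sketch below illustrates only this classical post-measurement step under simplifying assumptions: random numbers stand in for the per-token circuit outputs, and the function name and toy dimensions are illustrative, not the paper's implementation.

```python
# Minimal sketch of Gaussian projected self-attention (classical post-processing
# only). Assumption: q and k hold scalar measurement outcomes of the query/key
# parameterized quantum circuits, and v holds the value circuits' expectation
# vectors; here random numbers stand in for all three.
import numpy as np

def gaussian_projected_self_attention(q, k, v):
    """q, k: shape (n,) scalars per token; v: shape (n, d) value vectors.
    Returns the (n, d) attended outputs."""
    # Gaussian kernel replaces the usual softmax over inner products q_s . k_j:
    # alpha[s, j] = exp(-(q_s - k_j)**2)
    scores = np.exp(-np.square(q[:, None] - k[None, :]))
    weights = scores / scores.sum(axis=1, keepdims=True)  # normalize over j
    return weights @ v

# Toy usage: 5 tokens with 4-dimensional value vectors.
rng = np.random.default_rng(0)
n, d = 5, 4
out = gaussian_projected_self_attention(rng.standard_normal(n),
                                        rng.standard_normal(n),
                                        rng.standard_normal((n, d)))
print(out.shape)  # (5, 4)
```

Here the kernel acts only on scalar circuit outputs; the exact parameterized circuits that produce q, k, and v in QSANN are not reproduced in this sketch.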
Pages: 13
Related papers (50 in total)
  • [41] Self-Attention Generative Adversarial Networks
    Zhang, Han
    Goodfellow, Ian
    Metaxas, Dimitris
    Odena, Augustus
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 97, 2019
  • [42] A Study on the Classification of Cancers with Lung Cancer Pathological Images Using Deep Neural Networks and Self-Attention Structures
    Kim, Seung Hyun
    Kang, Ho Chul
    JOURNAL OF POPULATION THERAPEUTICS AND CLINICAL PHARMACOLOGY, 2023, 30 (06): E374-E383
  • [43] SAFSN: A Self-Attention Based Neural Network for Encrypted Mobile Traffic Classification
    Zhang, Chengyuan
    An, Changqing
    Wang, Jessie Hui
    Zhao, Ziyi
    Yu, Tao
    Wang, Jilong
    2021 IEEE CONGRESS ON CYBERMATICS / IEEE INTERNATIONAL CONFERENCES ON INTERNET OF THINGS (ITHINGS) / GREEN COMPUTING AND COMMUNICATIONS (GREENCOM) / CYBER, PHYSICAL AND SOCIAL COMPUTING (CPSCOM) / SMART DATA (SMARTDATA), 2021: 330-337
  • [44] A light-weight quantum self-attention model for classical data classification
    Zhang, Hui
    Zhao, Qinglin
    Chen, Chuangtao
    APPLIED INTELLIGENCE, 2024, 54 (04): 3077-3091
  • [45] Self-Attention Networks for Code Search
    Fang, Sen
    Tan, You-Shuai
    Zhang, Tao
    Liu, Yepang
    INFORMATION AND SOFTWARE TECHNOLOGY, 2021, 134
  • [46] Quantum entanglement and self-attention neural networks: An investigation into passengers and stops characteristics for optimal bus stop localization
    Liu, Hongjie
    Yuan, Tengfei
    Zhang, Xinhuan
    Xu, Hongzhe
    INFORMATION FUSION, 2024, 112
  • [47] Modeling Localness for Self-Attention Networks
    Yang, Baosong
    Tu, Zhaopeng
    Wong, Derek F.
    Meng, Fandong
    Chao, Lidia S.
    Zhang, Tong
    2018 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP 2018), 2018: 4449-4458
  • [49] Leveraging contextual embeddings and self-attention neural networks with bi-attention for sentiment analysis
    Biesialska, Magdalena
    Biesialska, Katarzyna
    Rybinski, Henryk
    JOURNAL OF INTELLIGENT INFORMATION SYSTEMS, 2021, 57: 601-626
  • [50] Unifying topological structure and self-attention mechanism for node classification in directed networks
    Peng, Yue
    Xia, Jiwen
    Liu, Dafeng
    Liu, Miao
    Xiao, Long
    Shi, Benyun
    SCIENTIFIC REPORTS, 2025, 15 (01)