A Quantum Annealing Instance Selection Approach for Efficient and Effective Transformer Fine-Tuning

Cited: 0
Authors
Pasin, Andrea [1 ]
Cunha, Washington [2 ]
Goncalves, Marcos Andre [2 ]
Ferro, Nicola [1 ]
Affiliations
[1] Univ Padua, Padua, Italy
[2] Univ Fed Minas Gerais, Belo Horizonte, MG, Brazil
Funding
São Paulo Research Foundation (FAPESP), Brazil;
Keywords
Instance Selection; Quantum Computing; Text Classification;
DOI
10.1145/3664190.3672515
Chinese Library Classification (CLC)
TP301 [Theory and Methods];
Discipline Code
081202;
Abstract
Deep Learning approaches have become pervasive in recent years due to their ability to solve complex tasks. However, these models need huge datasets for proper training and good generalization, which translates into high training and fine-tuning times, up to several days for the most complex models and large datasets. In this work, we present a novel quantum Instance Selection (IS) approach that allows us to significantly reduce the size of the training datasets (by up to 28%) while maintaining the model's effectiveness, thus promoting (training) speedups and scalability. Our solution is innovative in the sense that it exploits a different computing paradigm, Quantum Annealing (QA), a specific Quantum Computing paradigm that can be used to tackle optimization problems. To the best of our knowledge, there have been no prior attempts to tackle the IS problem using QA. Furthermore, we propose a new Quadratic Unconstrained Binary Optimization (QUBO) formulation specific to the IS problem, which is a contribution in itself. Through an extensive set of experiments with several Text Classification benchmarks, we empirically demonstrate our quantum solution's feasibility and competitiveness with the current state-of-the-art IS solutions.
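Note: the abstract names but does not spell out the proposed QUBO formulation, so the sketch below is only a rough, hypothetical illustration (not the authors' method) of how Instance Selection can be cast in the generic QUBO form that quantum annealers minimize, E(x) = x^T Q x over binary x, where x_i = 1 means training instance i is kept. The similarity proxy, the weights alpha and beta, and the brute-force solver are all assumptions made for this example.

    # Hypothetical QUBO sketch for Instance Selection (NOT the paper's formulation).
    # Diagonal entries of Q reward keeping an instance; off-diagonal entries
    # penalize keeping pairs of highly similar (redundant) instances.
    import itertools
    import numpy as np

    rng = np.random.default_rng(0)
    n = 10                              # toy dataset size (assumption)
    emb = rng.normal(size=(n, 4))       # stand-in instance embeddings
    sim = emb @ emb.T                   # toy pairwise-similarity proxy

    alpha, beta = 1.0, 0.1              # assumed reward/penalty weights
    Q = beta * np.triu(sim, k=1)        # strict upper triangle: redundancy penalty
    np.fill_diagonal(Q, -alpha)         # diagonal: reward for selecting an instance

    # A quantum annealer would minimize x^T Q x natively; here we enumerate
    # all 2^n assignments, which is feasible only at toy sizes.
    best_x, best_e = None, float("inf")
    for bits in itertools.product((0, 1), repeat=n):
        x = np.asarray(bits)
        e = float(x @ Q @ x)            # QUBO energy x^T Q x
        if e < best_e:
            best_x, best_e = x, e
    print("kept instances:", np.flatnonzero(best_x).tolist(), "energy:", best_e)

On actual annealing hardware, Q would be handed to a QUBO sampler rather than enumerated; the enumeration here only makes explicit the energy function being minimized.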
Pages: 205-214
Page count: 10
Related Papers
50 records in total (items 41-50 shown)
  • [41] BitFit: Simple Parameter-efficient Fine-tuning for Transformer-based Masked Language-models
    Ben-Zaken, Elad
    Ravfogel, Shauli
    Goldberg, Yoav
PROCEEDINGS OF THE 60TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2022), VOL 2 (SHORT PAPERS), 2022: 1 - 9
  • [42] How Fine-Tuning Allows for Effective Meta-Learning
    Chua, Kurtland
    Lei, Qi
    Lee, Jason D.
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [43] Structured Fine-Tuning of Contextual Embeddings for Effective Biomedical Retrieval
    Ueda, Alberto
    Santos, Rodrygo L. T.
    Macdonald, Craig
    Ounis, Iadh
SIGIR '21 - PROCEEDINGS OF THE 44TH INTERNATIONAL ACM SIGIR CONFERENCE ON RESEARCH AND DEVELOPMENT IN INFORMATION RETRIEVAL, 2021: 2031 - 2035
  • [44] Effective Fine-tuning Methods for Cross-lingual Adaptation
    Yu, Tao
    Joty, Shafiq
2021 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP 2021), 2021: 8492 - 8501
  • [45] Selection and fine-tuning of the autoimmune T-cell repertoire
    Anderton, Stephen M.
    Wraith, David C.
    NATURE REVIEWS IMMUNOLOGY, 2002, 2 (07) : 487 - 498
  • [46] Examining and Fine-tuning the Selection of Glycan Compositions with GlyConnect Compozitor
    Robin, Thibault
    Mariethoz, Julien
    Lisacek, Frederique
    MOLECULAR & CELLULAR PROTEOMICS, 2020, 19 (10) : 1602 - 1618
  • [48] Anchoring Fine-tuning of Sentence Transformer with Semantic Label Information for Efficient Truly Few-shot Classification
    Pauli, Amalie Brogaard
    Derczynski, Leon
    Assent, Ira
2023 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP 2023), 2023: 11254 - 11264
  • [49] Efficient Latent Space Compression for Lightning-Fast Fine-Tuning and Inference of Transformer-Based Models
    Falaki, Ala Alam
    Gras, Robin
MACHINE LEARNING AND KNOWLEDGE EXTRACTION, 2023, 5 (03) : 847 - 867
  • [50] The Mechanism for Mimicry: Instant Biosemiotic Selection or Gradual Darwinian Fine-Tuning Selection?
    Alexander, V. N.
    BIOSEMIOTICS, 2019, 12 : 39 - 55