Adapter-Based Contextualized Meta Embeddings

Cited by: 0
Authors
O'Neill, James [1 ]
Dutta, Sourav [2 ]
Affiliations
[1] DynamoFL Ireland, Dublin, Ireland
[2] Huawei Ireland Res Ctr, Dublin, Ireland
Keywords
LoRA; Adapter; Meta Embedding; Multilingual;
DOI
10.1007/978-981-97-6125-8_7
CLC Classification
TP18 [Theory of Artificial Intelligence];
Discipline Codes
081104; 0812; 0835; 1405;
Abstract
This paper introduces MetaLoRA and MetaUniPELT, two meta-embedding approaches that extend Low-Rank Adaptation (LoRA) and adapters to fine-tune and combine multiple pretrained models. We find that both models improve performance across a range of monolingual and multilingual tasks, outperforming baselines such as fully fine-tuned single models, simple concatenation of pretrained embeddings with classification-layer fine-tuning, and soft-voting ensembles. On the XGLUE sentence classification tasks, we find a 1.7-point test-score increase over the best fully fine-tuned model and a 0.24-point increase over the best fully fine-tuned ensemble. Our results underscore the potential of parameter-efficient fine-tuning of ensembles as an efficient and effective alternative to full fine-tuning and standard ensemble methods.
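The abstract describes combining several pretrained encoders via LoRA adapters into a single meta-embedding. The paper's actual MetaLoRA architecture is given in the full text; the snippet below is only an illustrative NumPy sketch of the general idea, assuming a standard LoRA low-rank update (frozen weight plus a scaled `A @ B` path, with `B` initialized to zero) and a simple concatenate-then-project combination. All names here (`lora_delta`, `W_meta`, etc.) are invented for this illustration, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def lora_delta(x, A, B, alpha=8):
    """Low-rank LoRA update: (x @ A @ B) scaled by alpha / rank."""
    r = A.shape[1]
    return (x @ A @ B) * (alpha / r)

d, r = 16, 4
# Frozen "pretrained" projection weights standing in for two encoders.
W1 = rng.normal(size=(d, d))
W2 = rng.normal(size=(d, d))
# Trainable LoRA factors: A small random, B zero, so the adapter
# initially contributes nothing (the usual LoRA initialization).
A1, B1 = rng.normal(scale=0.01, size=(d, r)), np.zeros((r, d))
A2, B2 = rng.normal(scale=0.01, size=(d, r)), np.zeros((r, d))

x = rng.normal(size=(2, d))  # toy batch of token representations

# Each encoder's output = frozen path + low-rank adapter path.
h1 = x @ W1 + lora_delta(x, A1, B1)
h2 = x @ W2 + lora_delta(x, A2, B2)

# Meta-embedding: concatenate adapted embeddings, project back to d.
meta = np.concatenate([h1, h2], axis=-1)        # shape (2, 2*d)
W_meta = rng.normal(scale=1.0 / np.sqrt(2 * d), size=(2 * d, d))
z = meta @ W_meta                               # shape (2, d)
print(z.shape)  # (2, 16)
```

In this scheme only the LoRA factors and the meta-projection would be trained, while `W1` and `W2` stay frozen, which is what makes the ensemble parameter-efficient.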
Pages: 82 - 90
Number of pages: 9
Related Papers
50 items in total
  • [41] CEQE to SQET: A study of contextualized embeddings for query expansion
    Naseri, Shahrzad
    Dalton, Jeffrey
    Yates, Andrew
    Allan, James
    INFORMATION RETRIEVAL JOURNAL, 2022, 25 (02): 184 - 208
  • [42] Unsupervised Domain Adaptation of Contextualized Embeddings for Sequence Labeling
    Han, Xiaochuang
    Eisenstein, Jacob
    2019 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING AND THE 9TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING (EMNLP-IJCNLP 2019): PROCEEDINGS OF THE CONFERENCE, 2019, : 4238 - 4248
  • [43] Contextualized Embeddings based Transformer Encoder for Sentence Similarity Modeling in Answer Selection Task
    Laskar, Md Tahmid Rahman
    Huang, Jimmy
    Hoque, Enamul
    PROCEEDINGS OF THE 12TH INTERNATIONAL CONFERENCE ON LANGUAGE RESOURCES AND EVALUATION (LREC 2020), 2020, : 5505 - 5514
  • [44] Deep Contextualized Word Embeddings for Universal Dependency Parsing
    Liu, Yijia
    Che, Wanxiang
    Wang, Yuxuan
    Zheng, Bo
    Qin, Bing
    Liu, Ting
    ACM TRANSACTIONS ON ASIAN AND LOW-RESOURCE LANGUAGE INFORMATION PROCESSING, 2020, 19 (01)
  • [45] BRUCE: Bundle Recommendation Using Contextualized item Embeddings
    Brosh, Tzoof Avny
    Livne, Amit
    Shalom, Oren Sar
    Shapira, Bracha
    Last, Mark
    PROCEEDINGS OF THE 16TH ACM CONFERENCE ON RECOMMENDER SYSTEMS, RECSYS 2022, 2022, : 237 - 245
  • [47] Examining the effect of whitening on static and contextualized word embeddings
    Sasaki, Shota
    Heinzerling, Benjamin
    Suzuki, Jun
    Inui, Kentaro
    INFORMATION PROCESSING & MANAGEMENT, 2023, 60 (03)
  • [48] Contextualized Word Embeddings Expose Ethnic Biases in News
    Thijs, Guusje
    Trilling, Damian
    Kroon, Anne C.
    16TH ACM WEB SCIENCE CONFERENCE, WEBSCIENCE 2024, 2024, : 290 - 295
  • [49] WAD-X: Improving Zero-shot Cross-lingual Transfer via Adapter-based Word Alignment
    Ahmat, Ahtamjan
    Yang, Yating
    Ma, Bo
    Dong, Rui
    Lu, Kaiwen
    Wang, Lei
    ACM TRANSACTIONS ON ASIAN AND LOW-RESOURCE LANGUAGE INFORMATION PROCESSING, 2023, 22 (09)
  • [50] Self-supervised Learning of Contextualized Local Visual Embeddings
    Silva, Thalles
    Pedrini, Helio
    Rivera, Adin Ramirez
    2023 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION WORKSHOPS, ICCVW, 2023, : 177 - 186