Adaptable Adapters

Cited by: 0
Authors
Moosavi, Nafise Sadat [1 ]
Delfosse, Quentin [3 ]
Kersting, Kristian [2 ,3 ]
Gurevych, Iryna [2 ,4 ]
Affiliations
[1] Univ Sheffield, Dept Comp Sci, Sheffield, S Yorkshire, England
[2] Tech Univ Darmstadt, Hessian Ctr AI Hessian AI, Darmstadt, Germany
[3] Tech Univ Darmstadt, AI & Machine Learning Lab, Darmstadt, Germany
[4] Tech Univ Darmstadt, Ubiquitous Knowledge Proc Lab UKP Lab, Dept Comp Sci, Darmstadt, Germany
Keywords
DOI
Not available
CLC classification
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
State-of-the-art pretrained NLP models contain hundreds of millions to trillions of parameters. Adapters provide a parameter-efficient alternative to full finetuning, in which only lightweight, randomly initialized neural network layers are finetuned on top of the frozen pretrained weights. However, existing work uses the same adapter architecture, i.e., the same adapter layer on top of each layer of the pretrained model, for every dataset, regardless of the properties of the dataset or the amount of available training data. In this work, we introduce adaptable adapters, which contain (1) learnable activation functions for different layers and different input data, and (2) a learnable switch that selects and uses only the beneficial adapter layers. We show that adaptable adapters achieve on-par performance with the standard adapter architecture while using a considerably smaller number of adapter layers. In addition, we show that the adapter architecture selected by adaptable adapters transfers well across different data settings and similar tasks. We propose using adaptable adapters for designing efficient and effective adapter architectures. The resulting adapters (a) contain about 50% of the learnable parameters of the standard adapter, and are therefore more efficient at training and inference and require less storage space, and (b) achieve considerably higher performance in low-data settings.
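A minimal sketch of the architecture described in the abstract, assuming a standard PyTorch bottleneck adapter. The class names (RationalActivation, AdaptableAdapterLayer), the polynomial degrees, and the sigmoid-gated switch are illustrative assumptions, not the authors' released implementation.

import torch
import torch.nn as nn


class RationalActivation(nn.Module):
    # Learnable rational activation: P(x) / (1 + |Q(x)|), learned per adapter layer.
    def __init__(self, num_degree: int = 5, den_degree: int = 4):
        super().__init__()
        self.num_coeffs = nn.Parameter(torch.randn(num_degree + 1) * 0.1)
        self.den_coeffs = nn.Parameter(torch.randn(den_degree) * 0.1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        powers_p = torch.stack([x ** i for i in range(self.num_coeffs.numel())], dim=-1)
        numerator = (powers_p * self.num_coeffs).sum(-1)
        powers_q = torch.stack([x ** (i + 1) for i in range(self.den_coeffs.numel())], dim=-1)
        denominator = 1.0 + (powers_q * self.den_coeffs).sum(-1).abs()
        return numerator / denominator


class AdaptableAdapterLayer(nn.Module):
    # Bottleneck adapter with a learnable activation and a learnable on/off switch.
    def __init__(self, hidden_size: int = 768, bottleneck: int = 48):
        super().__init__()
        self.down = nn.Linear(hidden_size, bottleneck)
        self.activation = RationalActivation()
        self.up = nn.Linear(bottleneck, hidden_size)
        # Switch logit; adapter layers whose gate stays near zero after training
        # can be dropped entirely, giving the smaller selected architecture.
        self.switch_logit = nn.Parameter(torch.zeros(1))

    def forward(self, hidden_states: torch.Tensor) -> torch.Tensor:
        gate = torch.sigmoid(self.switch_logit)
        adapter_out = self.up(self.activation(self.down(hidden_states)))
        return hidden_states + gate * adapter_out


# Tiny usage check with random inputs shaped (batch, sequence, hidden).
if __name__ == "__main__":
    layer = AdaptableAdapterLayer()
    x = torch.randn(2, 16, 768)
    print(layer(x).shape)  # torch.Size([2, 16, 768])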
Pages: 3742-3753
Page count: 12
Related papers
50 records in total
  • [1] Learned Adapters Are Better Than Manually Designed Adapters
    Zhang, Yuming
    Wang, Peng
    Tan, Ming
    Zhu, Wei
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2023), 2023, : 7420 - 7437
  • [2] Early adapters
    Kessler, Rebecca
    NATURAL HISTORY, 2007, 116 (03) : 13 - 13
  • [3] ADAPTERS FOR MICROCATHETERS
    WENDELL, A
    ANAESTHESIA, 1991, 46 (10) : 896 - 896
  • [4] Beware the adapters
    Hayton, Bruce
    RESPIRATORY CARE, 2007, 52 (12) : 1782 - 1782
  • [5] EARLY ADAPTERS
    McKee, Bradford
    LANDSCAPE ARCHITECTURE MAGAZINE, 2019, 109 (01) : 12 - 12
  • [6] Adapters for metallurgical equipment
    Artiukh, Viktor
    Mazur, Vladlen
    Kargin, Sergey
    Zakharova, Lidya
    INTERNATIONAL SCIENCE CONFERENCE SPBWOSCE-2017 BUSINESS TECHNOLOGIES FOR SUSTAINABLE URBAN DEVELOPMENT, 2018, 170
  • [7] PITLESS ADAPTERS AND PROGRESS
    SCOTT, EA
    GROUND WATER AGE, 1973, 7 (08) : 52 - 52
  • [8] Adapters in lymphocyte signalling
    Leo, A
    Schraven, B
    CURRENT OPINION IN IMMUNOLOGY, 2001, 13 (03) : 307 - 316
  • [9] Output iterator adapters
    Dewhurst, Steve
    C/C++ Users Journal, 2002, 20 (02) : 58 - 60
  • [10] NEW ADAPTERS AVAILABLE
    ISAACS, H
    CRANSTON, IW
    HYDRAULICS & PNEUMATICS, 1973, 26 (02) : 10 - 10