AdapterHub Playground: Simple and Flexible Few-Shot Learning with Adapters

Cited by: 0
Authors
Beck, Tilman [1 ]
Bohlender, Bela [1 ]
Viehmann, Christina [2 ]
Hane, Vincent [1 ]
Adamson, Yanik [1 ]
Khuri, Jaber [1 ]
Brossmann, Jonas [1 ]
Pfeiffer, Jonas [1 ]
Gurevych, Iryna [1 ]
Affiliations
[1] Tech Univ Darmstadt, Dept Comp Sci, Ubiquitous Knowledge Proc Lab UKP Lab, Darmstadt, Germany
[2] Johannes Gutenberg Univ Mainz, Inst Publizist, Mainz, Germany
Keywords
DOI
Not available
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The open-access dissemination of pretrained language models through online repositories has led to a democratization of state-of-the-art natural language processing (NLP) research. This also allows people outside of NLP to use such models and adapt them to specific use-cases. However, a certain amount of technical proficiency is still required, which is an entry barrier for users who want to apply these models to a certain task but lack the necessary knowledge or resources. In this work, we aim to overcome this gap by providing a tool which allows researchers to leverage pretrained models without writing a single line of code. Built upon the parameter-efficient adapter modules for transfer learning, our AdapterHub Playground provides an intuitive interface, allowing the usage of adapters for prediction, training, and analysis of textual data for a variety of NLP tasks. We present the tool's architecture and demonstrate its advantages with prototypical use-cases, where we show that predictive performance can easily be increased in a few-shot learning scenario. Finally, we evaluate its usability in a user study. We provide the code and a live interface(1).
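For context on the "parameter-efficient adapter modules" the abstract builds on: in the common bottleneck formulation (Houlsby et al.), an adapter is a small residual down-/up-projection layer inserted into a frozen pretrained transformer, so that only the adapter weights are trained. A minimal NumPy sketch of that idea, purely illustrative and not the Playground's actual implementation (the function name and weight layout are assumptions for this example):

```python
import numpy as np

def bottleneck_adapter(h, W_down, b_down, W_up, b_up):
    """Residual bottleneck adapter applied to hidden states h.

    h:      (batch, d_model) frozen transformer hidden states
    W_down: (d_model, d_bottleneck) trainable down-projection
    W_up:   (d_bottleneck, d_model) trainable up-projection
    """
    # Down-project into a small bottleneck, apply a nonlinearity,
    # up-project back, and add the residual connection.
    z = np.maximum(0.0, h @ W_down + b_down)  # ReLU bottleneck
    return h + z @ W_up + b_up
```

With the up-projection initialized to zero, the adapter starts as the identity function, so inserting it does not perturb the pretrained model before training; only the small adapter matrices are updated, which is what makes the approach parameter-efficient.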
Pages: 61-75 (15 pages)