Investigating Meta-Learning Algorithms for Low-Resource Natural Language Understanding Tasks

Cited by: 0
Authors
Dou, Zi-Yi [1 ]
Yu, Keyi [1 ]
Anastasopoulos, Antonios [1 ]
Affiliations
[1] Carnegie Mellon Univ, Language Technol Inst, Pittsburgh, PA 15213 USA
Funding
U.S. National Science Foundation
Keywords
None listed
DOI
None available
CLC Number
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Learning general representations of text is a fundamental problem for many natural language understanding (NLU) tasks. Previously, researchers have proposed to use language model pre-training and multi-task learning to learn robust representations. However, these methods can achieve sub-optimal performance in low-resource scenarios. Inspired by the recent success of optimization-based meta-learning algorithms, in this paper, we explore the model-agnostic meta-learning algorithm (MAML) and its variants for low-resource NLU tasks. We validate our methods on the GLUE benchmark and show that our proposed models can outperform several strong baselines. We further empirically demonstrate that the learned representations can be adapted to new tasks efficiently and effectively.
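The abstract refers to optimization-based meta-learning in the style of MAML: an inner loop adapts parameters to each task's support set, and an outer loop updates the shared initialization using the post-adaptation query loss. A minimal first-order (FOMAML) sketch on toy 1-D linear-regression tasks is shown below; this is purely illustrative and not the paper's NLU setup, and all function names and hyperparameters here are assumptions.

```python
import numpy as np

def loss_grad(w, X, y):
    """Gradient of mean squared error for a linear model y_hat = X @ w."""
    return 2 * X.T @ (X @ w - y) / len(y)

def fomaml_step(w, tasks, inner_lr=0.05, meta_lr=0.1, inner_steps=3):
    """One meta-update: adapt a copy of w on each task's support set,
    then move the shared parameters along the averaged post-adaptation
    query gradient (first-order approximation, no second derivatives)."""
    meta_grad = np.zeros_like(w)
    for (Xs, ys), (Xq, yq) in tasks:
        w_task = w.copy()
        for _ in range(inner_steps):            # inner-loop adaptation
            w_task -= inner_lr * loss_grad(w_task, Xs, ys)
        meta_grad += loss_grad(w_task, Xq, yq)  # first-order outer gradient
    return w - meta_lr * meta_grad / len(tasks)

# Toy usage: each task is a linear regression with a different slope.
rng = np.random.default_rng(0)

def make_task(slope, n=20):
    X = rng.normal(size=(n, 1))
    y = X[:, 0] * slope
    return (X, y), (X, y)   # (support, query); shared here for brevity

w = np.zeros(1)
for _ in range(100):
    tasks = [make_task(s) for s in rng.uniform(0.5, 1.5, size=4)]
    w = fomaml_step(w, tasks)
# After meta-training, w sits near the center of the task distribution,
# so a few inner-loop steps adapt it quickly to any new task.
```

The point the paper exploits is the same one this toy shows: the meta-learned initialization is chosen for fast adaptability, which is what makes it attractive when target-task data is scarce.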
Pages: 1192-1197
Page count: 6
Related Papers
50 records in total
  • [1] Meta-Learning for Low-resource Natural Language Generation in Task-oriented Dialogue Systems
    Mi, Fei
    Huang, Minlie
    Zhang, Jiyong
    Faltings, Boi
    [J]. PROCEEDINGS OF THE TWENTY-EIGHTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2019, : 3151 - 3157
  • [2] Meta Auxiliary Learning for Low-resource Spoken Language Understanding
    Gao, Yingying
    Feng, Junlan
    Deng, Chao
    Zhang, Shilei
    [J]. INTERSPEECH 2022, 2022, : 2703 - 2707
  • [3] Meta-Learning for Low-Resource Speech Emotion Recognition
    Chopra, Suransh
    Mathur, Puneet
    Sawhney, Ramit
    Shah, Rajiv Ratn
    [J]. 2021 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP 2021), 2021, : 6259 - 6263
  • [4] Meta-Learning for Low-Resource Neural Machine Translation
    Gu, Jiatao
    Wang, Yong
    Chen, Yun
    Cho, Kyunghyun
    Li, Victor O. K.
    [J]. 2018 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP 2018), 2018, : 3622 - 3631
  • [5] Lightweight Meta-Learning for Low-Resource Abstractive Summarization
    Huh, Taehun
    Ko, Youngjoong
    [J]. PROCEEDINGS OF THE 45TH INTERNATIONAL ACM SIGIR CONFERENCE ON RESEARCH AND DEVELOPMENT IN INFORMATION RETRIEVAL (SIGIR '22), 2022, : 2629 - 2633
  • [6] Language-Agnostic Meta-Learning for Low-Resource Text-to-Speech with Articulatory Features
    Lux, Florian
    Vu, Ngoc Thang
    [J]. PROCEEDINGS OF THE 60TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2022), VOL 1: (LONG PAPERS), 2022, : 6858 - 6868
  • [7] Robust Speech Recognition using Meta-learning for Low-resource Accents
    Eledath, Dhanya
    Baby, Arun
    Singh, Shatrughan
    [J]. 2024 NATIONAL CONFERENCE ON COMMUNICATIONS, NCC, 2024,
  • [8] Knowledge-Aware Meta-learning for Low-Resource Text Classification
    Yao, Huaxiu
    Wu, Yingxin
    Al-Shedivat, Maruan
    Xing, Eric P.
    [J]. 2021 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP 2021), 2021, : 1814 - 1821
  • [9] Graph-Evolving Meta-Learning for Low-Resource Medical Dialogue Generation
    Lin, Shuai
    Zhou, Pan
    Liang, Xiaodan
    Tang, Jianheng
    Zhao, Ruihui
    Chen, Ziliang
    Lin, Liang
    [J]. THIRTY-FIFTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THIRTY-THIRD CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE AND THE ELEVENTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2021, 35 : 13362 - 13370
  • [10] Towards Low-Resource Semi-Supervised Dialogue Generation with Meta-Learning
    Huang, Yi
    Feng, Junlan
    Ma, Shuo
    Du, Xiaoyu
    Wu, Xiaoting
    [J]. FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, EMNLP 2020, 2020, : 4123 - 4128