The Power of Prompt Tuning for Low-Resource Semantic Parsing

Cited by: 0
Authors:
Schucher, Nathan [1,2,3]
Reddy, Siva [2,3,4]
de Vries, Harm [1]
Affiliations:
[1] ServiceNow Research, Santa Clara, CA 95054, USA
[2] Mila, Montreal, QC, Canada
[3] McGill University, Montreal, QC, Canada
[4] Facebook CIFAR AI Chair, Toronto, ON, Canada
Keywords: none listed
DOI: not available
CLC Number: TP18 (Theory of Artificial Intelligence)
Subject Classification Codes: 081104; 0812; 0835; 1405
Abstract:
Prompt tuning has recently emerged as an effective method for adapting pre-trained language models to a number of language understanding and generation tasks. In this paper, we investigate prompt tuning for semantic parsing, the task of mapping natural language utterances onto formal meaning representations. On the low-resource splits of Overnight and TOPv2, we find that a prompt-tuned T5-xl significantly outperforms its fine-tuned counterpart, as well as strong GPT-3 and BART baselines. We also conduct ablation studies across different model scales and target representations, finding that, with increasing model scale, prompt-tuned T5 models improve at generating target representations that are far from the pre-training distribution.
Pages: 148-156 (9 pages)
Related Papers (50 items in total):
  • [31] An Adversarial Joint Learning Model for Low-Resource Language Semantic Textual Similarity. Tian, Junfeng; Lan, Man; Wu, Yuanbin; Wang, Jingang; Qiu, Long; Li, Sheng; Lang, Jun; Si, Luo. Advances in Information Retrieval (ECIR 2018), 2018, 10772: 89-101.
  • [32] AgglutiFiT: Efficient Low-Resource Agglutinative Language Model Fine-Tuning. Li, Zhe; Li, Xiuhong; Sheng, Jiabao; Slamu, Wushour. IEEE Access, 2020, 8: 148489-148499.
  • [33] A Prompt-Based Topic-Modeling Method for Depression Detection on Low-Resource Data. Guo, Yanrong; Liu, Jilong; Wang, Lei; Qin, Wei; Hao, Shijie; Hong, Richang. IEEE Transactions on Computational Social Systems, 2024, 11(1): 1430-1439.
  • [34] KeyEE: Enhancing Low-Resource Generative Event Extraction with Auxiliary Keyword Sub-Prompt. Duan, Junwen; Liao, Xincheng; An, Ying; Wang, Jianxin. Big Data Mining and Analytics, 2024, 7(2): 547-560.
  • [35] Low-Resource Multi-Granularity Academic Function Recognition Based on Multiple Prompt Knowledge. Liu, Jiawei; Xiong, Zi; Jiang, Yi; Ma, Yongqiang; Lu, Wei; Huang, Yong; Cheng, Qikai. Electronic Library, 2024.
  • [36] PTSTEP: Prompt Tuning for Semantic Typing of Event Processes. Zhu, Wenhao; Xu, Yongxiu; Xu, Hongbo; Tang, Minghao; Zhu, Dongwei. Artificial Neural Networks and Machine Learning (ICANN 2023), Part III, 2023, 14256: 541-553.
  • [37] The Low-Resource Double Bind: An Empirical Study of Pruning for Low-Resource Machine Translation. Ahia, Orevaoghene; Kreutzer, Julia; Hooker, Sara. Findings of the Association for Computational Linguistics: EMNLP 2021, 2021: 3316-3333.
  • [38] Multilingual Dependency Parsing for Low-Resource Languages: Case Studies on North Saami and Komi-Zyrian. Lim, KyungTae; Partanen, Niko; Poibeau, Thierry. Proceedings of the Eleventh International Conference on Language Resources and Evaluation (LREC 2018), 2018: 2230-2235.
  • [39] Semantic Self-Segmentation for Abstractive Summarization of Long Documents in Low-Resource Regimes. Moro, Gianluca; Ragazzi, Luca. Thirty-Sixth AAAI Conference on Artificial Intelligence / Thirty-Fourth Conference on Innovative Applications of Artificial Intelligence / Twelfth Symposium on Educational Advances in Artificial Intelligence, 2022: 11085-11093.
  • [40] Contrastive Fine-Tuning for Low-Resource Graph-Level Transfer Learning. Duan, Yutai; Liu, Jie; Chen, Shaowei; Wu, Jianhua. Information Sciences, 2024, 659.