The Power of Prompt Tuning for Low-Resource Semantic Parsing

Cited by: 0
Authors
Schucher, Nathan [1 ,2 ,3 ]
Reddy, Siva [2 ,3 ,4 ]
de Vries, Harm [1 ]
Affiliations
[1] ServiceNow Res, Santa Clara, CA 95054 USA
[2] Mila, Montreal, PQ, Canada
[3] McGill Univ, Montreal, PQ, Canada
[4] Facebook CIFAR AI Chair, Toronto, ON, Canada
Keywords: (none listed)
DOI: (not available)
CLC number: TP18 [Artificial Intelligence Theory]
Subject classification codes: 081104; 0812; 0835; 1405
Abstract
Prompt tuning has recently emerged as an effective method for adapting pre-trained language models to a number of language understanding and generation tasks. In this paper, we investigate prompt tuning for semantic parsing, the task of mapping natural language utterances onto formal meaning representations. On the low-resource splits of Overnight and TOPv2, we find that a prompt-tuned T5-xl significantly outperforms its fine-tuned counterpart, as well as strong GPT-3 and BART baselines. We also conduct ablation studies across different model scales and target representations, finding that, with increasing model scale, prompt-tuned T5 models improve at generating target representations that are far from the pre-training distribution.
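The core mechanism the abstract refers to can be illustrated with a minimal sketch: prompt tuning freezes all model weights and trains only a small matrix of "soft prompt" embeddings that are prepended to the input token embeddings. The sketch below uses NumPy with toy shapes; all names, dimensions, and the random embedding table are illustrative assumptions, not details from the paper.

```python
import numpy as np

# Minimal sketch of prompt tuning, assuming a frozen encoder that consumes
# a sequence of d-dimensional token embeddings. Shapes are toy-sized.
d_model = 8      # embedding dimension (illustrative)
prompt_len = 4   # number of learnable soft-prompt vectors (illustrative)

rng = np.random.default_rng(0)

# The ONLY trainable parameters in prompt tuning: the soft-prompt matrix.
soft_prompt = rng.normal(scale=0.02, size=(prompt_len, d_model))

def embed_tokens(token_ids, vocab_embeddings):
    """Look up (frozen) token embeddings for the input utterance."""
    return vocab_embeddings[token_ids]

def prepend_prompt(token_embeddings):
    """Concatenate the learned prompt in front of the input embeddings.
    The frozen model then sees a sequence of length prompt_len + seq_len."""
    return np.concatenate([soft_prompt, token_embeddings], axis=0)

# Toy frozen embedding table and an input utterance of 3 tokens.
vocab = rng.normal(size=(100, d_model))
inputs = embed_tokens(np.array([5, 17, 42]), vocab)

model_input = prepend_prompt(inputs)
print(model_input.shape)  # (7, 8): 4 prompt vectors + 3 token embeddings
```

During training, gradients flow only into `soft_prompt`; the language model and its embedding table stay fixed, which is why the parameter count of the adaptation is tiny compared to full fine-tuning.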
Pages: 148-156
Page count: 9
Related Papers
50 items in total
  • [31] A Linguistic Resource for Semantic Parsing of Motion Events
    Roberts, Kirk
    Gullapalli, Srikanth
    Bejan, Cosmin Adrian
    Harabagiu, Sanda
    [J]. LREC 2010 - SEVENTH INTERNATIONAL CONFERENCE ON LANGUAGE RESOURCES AND EVALUATION, 2010, : 3293 - 3299
  • [32] Climbing the Tower of Treebanks: Improving Low-Resource Dependency Parsing via Hierarchical Source Selection
    Glavas, Goran
    Vulic, Ivan
    [J]. FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, ACL-IJCNLP 2021, 2021, : 4878 - 4888
  • [33] Cross-Lingual Transfer with Language-Specific Subnetworks for Low-Resource Dependency Parsing
    Choenni, Rochelle
    Garrette, Dan
    Shutova, Ekaterina
    [J]. COMPUTATIONAL LINGUISTICS, 2023, 49 (03) : 613 - 641
  • [34] Knowledge Collaborative Fine-tuning for Low-resource Knowledge Graph Completion
    Zhang, Ning-Yu
    Xie, Xin
    Chen, Xiang
    Deng, Shu-Min
    Ye, Hong-Bin
    Chen, Hua-Jun
    [J]. Ruan Jian Xue Bao/Journal of Software, 2022, 33 (10): : 3531 - 3545
  • [35] AgglutiFiT: Efficient Low-Resource Agglutinative Language Model Fine-Tuning
    Li, Zhe
    Li, Xiuhong
    Sheng, Jiabao
    Slamu, Wushour
    [J]. IEEE ACCESS, 2020, 8 : 148489 - 148499
  • [36] An Adversarial Joint Learning Model for Low-Resource Language Semantic Textual Similarity
    Tian, Junfeng
    Lan, Man
    Wu, Yuanbin
    Wang, Jingang
    Qiu, Long
    Li, Sheng
    Jun, Lang
    Si, Luo
    [J]. ADVANCES IN INFORMATION RETRIEVAL (ECIR 2018), 2018, 10772 : 89 - 101
  • [37] A Systematic Review on Semantic Role Labeling for Information Extraction in Low-Resource Data
    Ariyanto, Amelia Devi Putri
    Purwitasari, Diana
    Fatichah, Chastine
    [J]. IEEE ACCESS, 2024, 12 : 57917 - 57946
  • [38] A Prompt-Based Topic-Modeling Method for Depression Detection on Low-Resource Data
    Guo, Yanrong
    Liu, Jilong
    Wang, Lei
    Qin, Wei
    Hao, Shijie
    Hong, Richang
    [J]. IEEE TRANSACTIONS ON COMPUTATIONAL SOCIAL SYSTEMS, 2024, 11 (01) : 1430 - 1439
  • [39] KeyEE: Enhancing Low-Resource Generative Event Extraction with Auxiliary Keyword Sub-Prompt
    Duan, Junwen
    Liao, Xincheng
    An, Ying
    Wang, Jianxin
    [J]. BIG DATA MINING AND ANALYTICS, 2024, 7 (02): : 547 - 560
  • [40] Low-resource multi-granularity academic function recognition based on multiple prompt knowledge
    Liu, Jiawei
    Xiong, Zi
    Jiang, Yi
    Ma, Yongqiang
    Lu, Wei
    Huang, Yong
    Cheng, Qikai
    [J]. ELECTRONIC LIBRARY, 2024.