The Power of Prompt Tuning for Low-Resource Semantic Parsing

Cited by: 0
Authors
Schucher, Nathan [1 ,2 ,3 ]
Reddy, Siva [2 ,3 ,4 ]
de Vries, Harm [1 ]
Affiliations
[1] ServiceNow Res, Santa Clara, CA 95054 USA
[2] Mila, Montreal, PQ, Canada
[3] McGill Univ, Montreal, PQ, Canada
[4] Facebook CIFAR AI Chair, Toronto, ON, Canada
Keywords
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Prompt tuning has recently emerged as an effective method for adapting pre-trained language models to a number of language understanding and generation tasks. In this paper, we investigate prompt tuning for semantic parsing: the task of mapping natural language utterances onto formal meaning representations. On the low-resource splits of Overnight and TOPv2, we find that a prompt-tuned T5-xl significantly outperforms its fine-tuned counterpart, as well as strong GPT-3 and BART baselines. We also conduct ablation studies across different model scales and target representations, finding that, with increasing model scale, prompt-tuned T5 models improve at generating target representations that are far from the pre-training distribution.
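The prompt tuning referred to in the abstract trains a small set of soft prompt embeddings that are prepended to the input while the pre-trained model itself stays frozen. The sketch below illustrates that general setup with the HuggingFace `peft` library; the checkpoint name ("t5-small"), the prompt length, and the toy utterance/logical-form pair are illustrative assumptions and do not reproduce the paper's configuration (which prompt-tunes T5-xl on Overnight and TOPv2).

```python
# Minimal sketch of soft prompt tuning for a T5-style seq2seq model using the
# HuggingFace `peft` library. Illustrative assumptions: checkpoint "t5-small",
# 100 virtual tokens, and a made-up utterance/logical-form pair.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer
from peft import PromptTuningConfig, TaskType, get_peft_model

tokenizer = AutoTokenizer.from_pretrained("t5-small")
model = AutoModelForSeq2SeqLM.from_pretrained("t5-small")

# Freeze the backbone and train only a small matrix of prompt embeddings
# (num_virtual_tokens x hidden_size) that is prepended to every input.
peft_config = PromptTuningConfig(task_type=TaskType.SEQ_2_SEQ_LM, num_virtual_tokens=100)
model = get_peft_model(model, peft_config)
model.print_trainable_parameters()  # reports how few parameters are actually trained

# Toy training step: map an utterance onto a (placeholder) meaning representation.
utterance = "what is the highest rated article published in 2004"
target = "( listValue ( argmax ( article ) rating ) )"  # hypothetical logical form
inputs = tokenizer(utterance, return_tensors="pt")
labels = tokenizer(target, return_tensors="pt").input_ids
loss = model(input_ids=inputs.input_ids,
             attention_mask=inputs.attention_mask,
             labels=labels).loss
loss.backward()  # gradients reach only the prompt embeddings
```

Because only the prompt embeddings are optimized, the number of trained parameters is a tiny fraction of the full model, which is what makes the approach attractive in the low-resource splits the paper studies.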
Pages: 148 - 156
Page count: 9
Related Papers
50 records in total
  • [1] Low-Resource Compositional Semantic Parsing with Concept Pretraining
    Rongali, Subendhu
    Sridhar, Mukund
    Khan, Haidar
    Arkoudas, Konstantine
    Hamza, Wael
    McCallum, Andrew
    [J]. 17TH CONFERENCE OF THE EUROPEAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, EACL 2023, 2023, : 1410 - 1419
  • [2] PromptEM: Prompt-tuning for Low-resource Generalized Entity Matching
    Wang, Pengfei
    Zeng, Xiaocan
    Chen, Lu
    Ye, Fan
    Mao, Yuren
    Zhu, Junhao
    Gao, Yunjun
    [J]. PROCEEDINGS OF THE VLDB ENDOWMENT, 2022, 16 (02): 369 - 378
  • [3] Low-Resource Domain Adaptation for Compositional Task-Oriented Semantic Parsing
    Chen, Xilun
    Ghoshal, Asish
    Mehdad, Yashar
    Zettlemoyer, Luke
    Gupta, Sonal
    [J]. PROCEEDINGS OF THE 2020 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP), 2020, : 5090 - 5100
  • [4] Systematic Investigation of Strategies Tailored for Low-Resource Settings for Low-Resource Dependency Parsing
    Sandhan, Jivnesh
    Behera, Laxmidhar
    Goyal, Pawan
    [J]. 17TH CONFERENCE OF THE EUROPEAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, EACL 2023, 2023, : 2164 - 2171
  • [5] A systematic comparison of methods for low-resource dependency parsing on genuinely low-resource languages
    Vania, Clara
    Kementchedjhieva, Yova
    Søgaard, Anders
    Lopez, Adam
    [J]. 2019 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING AND THE 9TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING (EMNLP-IJCNLP 2019): PROCEEDINGS OF THE CONFERENCE, 2019, : 1105 - 1116
  • [6] Multi-Stage Prompt Tuning for Political Perspective Detection in Low-Resource Settings
    Kim, Kang-Min
    Lee, Mingyu
    Won, Hyun-Sik
    Kim, Min-Ji
    Kim, Yeachan
    Lee, SangKeun
    [J]. APPLIED SCIENCES-BASEL, 2023, 13 (10):
  • [7] Neural Semantic Parsing in Low-Resource Settings with Back-Translation and Meta-Learning
    Sun, Yibo
    Tang, Duyu
    Duan, Nan
    Gong, Yeyun
    Feng, Xiaocheng
    Qin, Bing
    Jiang, Daxin
    [J]. THIRTY-FOURTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THE THIRTY-SECOND INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE AND THE TENTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2020, 34 : 8960 - 8967
  • [8] Low-Resource Semantic Role Labeling
    Gormley, Matthew R.
    Mitchell, Margaret
    Van Durme, Benjamin
    Dredze, Mark
    [J]. PROCEEDINGS OF THE 52ND ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, VOL 1, 2014, : 1177 - 1187
  • [9] Self-PT: Adaptive Self-Prompt Tuning for Low-Resource Visual Question Answering
    Yuan, Bowen
    You, Sisi
    Bao, Bing-Kun
    [J]. PROCEEDINGS OF THE 31ST ACM INTERNATIONAL CONFERENCE ON MULTIMEDIA, MM 2023, 2023, : 5089 - 5098
  • [10] Prompt-based for Low-Resource Tibetan Text Classification
    An, Bo
    [J]. ACM TRANSACTIONS ON ASIAN AND LOW-RESOURCE LANGUAGE INFORMATION PROCESSING, 2023, 22 (08)