Knowledge-Enhanced Prompt Learning for Few-Shot Text Classification

Times Cited: 0
Authors
Liu, Jinshuo [1 ]
Yang, Lu [1 ]
Affiliations
[1] Wuhan Univ, Sch Cyber Sci & Engn, Key Lab Aerosp Informat Secur & Trusted Comp, Minist Educ, Wuhan 430072, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
text classification; prompt learning; knowledge enhancement; few-shot learning;
DOI
10.3390/bdcc8040043
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Classification methods based on fine-tuning pre-trained language models often require a large number of labeled samples; therefore, few-shot text classification has attracted considerable attention. Prompt learning is an effective method for addressing few-shot text classification tasks in low-resource settings. The essence of prompt tuning is to insert tokens into the input, thereby converting a text classification task into a masked language modeling problem. However, constructing appropriate prompt templates and verbalizers remains challenging: manual prompts require expert knowledge, while automatically constructed prompts are time-consuming to search for. In addition, the extensive knowledge contained in entities and relations should not be ignored. To address these issues, we propose structured knowledge prompt tuning (SKPT), a knowledge-enhanced prompt tuning method. Specifically, SKPT comprises three components: a prompt template, a prompt verbalizer, and training strategies. First, we insert virtual tokens into the prompt template based on open triples to introduce external knowledge. Second, we use an improved knowledgeable verbalizer to expand and filter the label words. Finally, we apply structured knowledge constraints during the training phase to optimize the model. Extensive experiments on few-shot text classification tasks under different settings demonstrate the effectiveness of our model.
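The core mechanism the abstract describes, namely recasting classification as masked language modeling and mapping the model's [MASK]-position scores back to classes through a verbalizer, can be illustrated with a minimal sketch. This is not the paper's SKPT implementation; the function names, template, and toy scores below are hypothetical, and a real system would obtain the mask-position scores from a pre-trained masked language model.

```python
# Hedged sketch: prompt-based classification via a verbalizer.
# A template such as "[X] In this sentence, the topic is [MASK]."
# turns classification into masked language modeling; the verbalizer
# maps each class to a set of label words and aggregates the model's
# scores for those words at the [MASK] position.

def verbalize(mask_scores, verbalizer):
    """Pick the class whose label words score highest at [MASK].

    mask_scores: dict mapping vocabulary word -> score at the [MASK] slot
    verbalizer:  dict mapping class name -> list of label words
    Returns the class with the highest mean label-word score.
    """
    class_scores = {}
    for label, words in verbalizer.items():
        scores = [mask_scores.get(w, 0.0) for w in words]
        class_scores[label] = sum(scores) / len(scores)
    return max(class_scores, key=class_scores.get)

# Toy scores a masked language model might assign at [MASK]
# (illustrative values only).
mask_scores = {"sports": 2.1, "football": 1.8, "politics": 0.3, "election": 0.2}
verbalizer = {
    "SPORTS": ["sports", "football"],
    "POLITICS": ["politics", "election"],
}
print(verbalize(mask_scores, verbalizer))  # -> SPORTS
```

An expanded verbalizer, as in SKPT, would enlarge each class's label-word list with knowledge-base neighbors and then filter out noisy words before this aggregation step.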
Pages: 12