Self-supervised Bidirectional Prompt Tuning for Entity-enhanced Pre-trained Language Model

Times Cited: 0
Authors
Zou, Jiaxin [1 ]
Xu, Xianghong [1 ]
Hou, Jiawei [2 ]
Yang, Qiang [2 ]
Zheng, Hai-Tao [1 ,3 ]
Affiliations
[1] Tsinghua Univ, Shenzhen Int Grad Sch, Shenzhen 518055, Peoples R China
[2] Weixin Grp, Dept Search & Applicat, Tencent, Peoples R China
[3] Pengcheng Lab, Shenzhen 518055, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
DOI
10.1109/IJCNN54540.2023.10192045
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104; 0812; 0835; 1405;
Abstract
With the rise of the pre-training paradigm, researchers are increasingly focusing on injecting external knowledge, such as entities and triplets from knowledge graphs, into pre-trained language models (PTMs) to improve their understanding and logical reasoning abilities. Such knowledge injection yields significant improvements on natural language understanding and generation tasks, along with a degree of interpretability. In this paper, we propose a novel two-stage entity knowledge enhancement pipeline for Chinese pre-trained models based on "bidirectional" prompt tuning. The pipeline consists of a "forward" stage, in which we construct fine-grained entity-type prompt templates to boost PTMs injected with entity knowledge, and a "backward" stage, in which the trained templates are used to generate type-constrained, context-dependent negative samples for contrastive learning. Experiments on six classification tasks in the Chinese Language Understanding Evaluation (CLUE) benchmark demonstrate that our approach significantly improves upon the baseline results on most datasets, particularly those that rely heavily on diverse and extensive knowledge.
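The abstract describes the bidirectional pipeline only at a high level. The sketch below is one plausible, minimal reading of it in Python, assuming an off-the-shelf Hugging Face masked-language-model backbone ("bert-base-chinese") in place of the paper's entity-enhanced Chinese PTM; the prompt template wording, the example sentence, entity, and single-character type word are illustrative assumptions, and the contrastive-learning objective that would consume the generated negatives is omitted. It is not the authors' released implementation.

    import torch
    from transformers import BertTokenizer, BertForMaskedLM

    # Illustrative backbone; the paper's entity-enhanced Chinese PTM would be used instead.
    tokenizer = BertTokenizer.from_pretrained("bert-base-chinese")
    model = BertForMaskedLM.from_pretrained("bert-base-chinese")


    def forward_stage_loss(sentence: str, entity: str, type_word: str) -> torch.Tensor:
        # "Forward" stage (sketch): a type-prediction prompt whose [MASK] should resolve to
        # the entity's fine-grained type; this cross-entropy would drive prompt/PTM tuning.
        prompt = f"{sentence} 其中 {entity} 是一个 {tokenizer.mask_token} 类实体。"
        enc = tokenizer(prompt, return_tensors="pt")
        labels = torch.full_like(enc["input_ids"], -100)   # ignore every position but the mask
        mask = enc["input_ids"] == tokenizer.mask_token_id
        labels[mask] = tokenizer.convert_tokens_to_ids(type_word)
        return model(**enc, labels=labels).loss


    @torch.no_grad()
    def backward_stage_negatives(sentence: str, entity: str, type_word: str, k: int = 5):
        # "Backward" stage (sketch): reuse the tuned template to propose type-constrained,
        # context-dependent replacement entities; swapping one in yields a hard negative
        # for contrastive learning (single-token replacements only, for brevity).
        prompt = f"{sentence} 其中 {tokenizer.mask_token} 是一个 {type_word} 类实体。"
        enc = tokenizer(prompt, return_tensors="pt")
        logits = model(**enc).logits
        row, col = (enc["input_ids"] == tokenizer.mask_token_id).nonzero()[0]
        top_ids = logits[row, col].topk(k).indices.tolist()
        candidates = [t for t in tokenizer.convert_ids_to_tokens(top_ids) if t != entity]
        return [sentence.replace(entity, c) for c in candidates]


    # Hypothetical usage with a made-up sentence, entity, and single-character type word.
    loss = forward_stage_loss("李白创作了静夜思。", "李白", "人")
    negatives = backward_stage_negatives("李白创作了静夜思。", "李白", "人")

In this reading, the "forward" stage supplies the type-prediction signal that specializes the template, and the "backward" stage reuses the same template to mine hard, type-consistent negatives, which is what makes the tuning "bidirectional".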
Pages: 8
Related Papers
50 records in total
  • [1] Prompt Tuning for Discriminative Pre-trained Language Models
    Yao, Yuan
    Dong, Bowen
    Zhang, Ao
    Zhang, Zhengyan
    Xie, Ruobing
    Liu, Zhiyuan
    Lin, Leyu
    Sun, Maosong
    Wang, Jianyong
    [J]. FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2022), 2022, : 3468 - 3473
  • [2] SPIQ: A Self-Supervised Pre-Trained Model for Image Quality Assessment
    Chen, Pengfei
    Li, Leida
    Wu, Qingbo
    Wu, Jinjian
    [J]. IEEE SIGNAL PROCESSING LETTERS, 2022, 29 : 513 - 517
  • [3] Self-supervised pre-trained neural network for quantum natural language processing
    Yao, Ben
    Tiwari, Prayag
    Li, Qiuchi
    [J]. NEURAL NETWORKS, 2025, 184
  • [4] Dual Modality Prompt Tuning for Vision-Language Pre-Trained Model
    Xing, Yinghui
    Wu, Qirui
    Cheng, De
    Zhang, Shizhou
    Liang, Guoqiang
    Wang, Peng
    Zhang, Yanning
    [J]. IEEE TRANSACTIONS ON MULTIMEDIA, 2024, 26 : 2056 - 2068
  • [5] Enhancing Pre-trained Language Models by Self-supervised Learning for Story Cloze Test
    Xie, Yuqiang
    Hu, Yue
    Xing, Luxi
    Wang, Chunhui
    Hu, Yong
    Wei, Xiangpeng
    Sun, Yajing
    [J]. KNOWLEDGE SCIENCE, ENGINEERING AND MANAGEMENT (KSEM 2020), PT I, 2020, 12274 : 271 - 279
  • [6] DictPrompt: Comprehensive dictionary-integrated prompt tuning for pre-trained language model
    Cao, Rui
    Wang, Yihao
    Gao, Ling
    Yang, Meng
    [J]. KNOWLEDGE-BASED SYSTEMS, 2023, 273
  • [7] Speech Enhancement Using Self-Supervised Pre-Trained Model and Vector Quantization
    Zhao, Xiao-Ying
    Zhu, Qiu-Shi
    Zhang, Jie
    [J]. PROCEEDINGS OF 2022 ASIA-PACIFIC SIGNAL AND INFORMATION PROCESSING ASSOCIATION ANNUAL SUMMIT AND CONFERENCE (APSIPA ASC), 2022, : 330 - 334
  • [8] CPT: Colorful Prompt Tuning for pre-trained vision-language models
    Yao, Yuan
    Zhang, Ao
    Zhang, Zhengyan
    Liu, Zhiyuan
    Chua, Tat-Seng
    Sun, Maosong
    [J]. AI OPEN, 2024, 5 : 30 - 38
  • [9] Self-Supervised Quantization of Pre-Trained Neural Networks for Multiplierless Acceleration
    Vogel, Sebastian
    Springer, Jannik
    Guntoro, Andre
    Ascheid, Gerd
    [J]. 2019 DESIGN, AUTOMATION & TEST IN EUROPE CONFERENCE & EXHIBITION (DATE), 2019, : 1094 - 1099