Relational Prompt-Based Pre-Trained Language Models for Social Event Detection

Cited by: 0
Authors
Li, Pu [1 ]
Yu, Xiaoyan [2 ]
Peng, Hao [3 ]
Xian, Yantuan [1 ]
Wang, Linqin [1 ]
Sun, Li [4 ]
Zhang, Jingyun [3 ]
Yu, Philip S. [5 ]
Affiliations
[1] Kunming University of Science and Technology, Kunming, China
[2] Beijing Institute of Technology, Beijing, China
[3] Beihang University, Beijing, China
[4] North China Electric Power University, Beijing, China
[5] University of Illinois at Chicago, Chicago, IL, United States
Keywords
Cluster analysis; Contrastive learning; Economic and social effects; Graph embeddings; Graph neural networks; Network embeddings; Public risks; Risk analysis; Risk assessment
DOI
10.1145/3695869
Abstract
Social Event Detection (SED) aims to identify significant events from social streams and has wide applications, ranging from public opinion analysis to risk management. In recent years, Graph Neural Network (GNN) based solutions have achieved state-of-the-art performance. However, GNN-based methods often struggle with missing and noisy edges between messages, which degrades the quality of the learned message embeddings. Moreover, these methods statically initialize node embeddings before training, which in turn limits their ability to learn from message texts and relations simultaneously. In this article, we approach social event detection from a new perspective based on Pre-trained Language Models (PLMs) and present RPLM_SED (Relational prompt-based Pre-trained Language Models for Social Event Detection). We first propose a new pairwise message modeling strategy that constructs social messages into message pairs with multi-relational sequences. Second, we propose a new multi-relational prompt-based pairwise message learning mechanism that uses PLMs to learn more comprehensive message representations from message pairs with multi-relational prompts. Third, we design a new clustering constraint that optimizes the encoding process by enhancing intra-cluster compactness and inter-cluster dispersion, making the message representations more distinguishable. We evaluate RPLM_SED on three real-world datasets, demonstrating that the model achieves state-of-the-art performance in offline, online, low-resource, and long-tail distribution scenarios for social event detection tasks. © 2024 Copyright held by the owner/author(s). Publication rights licensed to ACM.
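
To make the abstract's components concrete, below is a minimal, hedged PyTorch sketch of (a) joining two messages and their shared relations into one prompt-style input sequence and (b) a compactness/dispersion clustering constraint. The prompt template, the function names (build_pair_prompt, cluster_constraint_loss), and the exact loss form are illustrative assumptions, not the authors' implementation.

    # Hedged sketch: pairwise relational prompts + a clustering constraint.
    # All names and templates here are illustrative assumptions.
    import torch
    import torch.nn.functional as F

    def build_pair_prompt(msg_a: str, msg_b: str, relations: list[str]) -> str:
        """Join two messages with their shared relations (e.g. common
        hashtags, users, entities) into one sequence for a PLM encoder.
        The template is an assumption, not the paper's exact format."""
        rel = " ".join(f"[{r}]" for r in relations) or "[NO_RELATION]"
        return f"{msg_a} [SEP] {rel} [SEP] {msg_b}"

    def cluster_constraint_loss(emb: torch.Tensor,
                                labels: torch.Tensor,
                                margin: float = 1.0) -> torch.Tensor:
        """Pull embeddings toward their event centroid (intra-cluster
        compactness) and push distinct centroids apart (inter-cluster
        dispersion). A generic formulation of the constraint the
        abstract describes, not the authors' exact objective."""
        loss = emb.new_zeros(())
        centroids = []
        for c in labels.unique():
            members = emb[labels == c]
            centroid = members.mean(dim=0)
            centroids.append(centroid)
            # mean squared distance of members to their centroid
            loss = loss + (members - centroid).pow(2).sum(dim=1).mean()
        centroids = torch.stack(centroids)            # (k, d)
        dists = torch.cdist(centroids, centroids)     # (k, k) pairwise distances
        k = centroids.size(0)
        off_diag = dists[~torch.eye(k, dtype=torch.bool)]
        # hinge: penalize centroid pairs closer than `margin`
        loss = loss + F.relu(margin - off_diag).mean()
        return loss

    # Toy usage: 8 message embeddings from 3 events.
    emb = torch.randn(8, 16, requires_grad=True)
    labels = torch.tensor([0, 0, 0, 1, 1, 2, 2, 2])
    print(build_pair_prompt("quake hits city", "tremor felt downtown",
                            ["same_hashtag", "same_entity"]))
    print(cluster_constraint_loss(emb, labels))

In a full pipeline, build_pair_prompt's output would be tokenized and encoded by the PLM, and cluster_constraint_loss would be applied to the resulting embeddings alongside the pairwise learning objective.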
Related papers
50 records in total
  • [1] The Biases of Pre-Trained Language Models: An Empirical Study on Prompt-Based Sentiment Analysis and Emotion Detection
    Mao, Rui
    Liu, Qian
    He, Kai
    Li, Wei
    Cambria, Erik
    IEEE TRANSACTIONS ON AFFECTIVE COMPUTING, 2023, 14 (03) : 1743 - 1753
  • [2] Prompt Tuning for Discriminative Pre-trained Language Models
    Yao, Yuan
    Dong, Bowen
    Zhang, Ao
    Zhang, Zhengyan
    Xie, Ruobing
    Liu, Zhiyuan
    Lin, Leyu
    Sun, Maosong
    Wang, Jianyong
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2022), 2022, : 3468 - 3473
  • [3] Prompt-based Pre-trained Model for Personality and Interpersonal Reactivity Prediction
    Li, Bin
    Weng, Yixuan
    Song, Qiya
    Ma, Fuyan
    Sun, Bin
    Li, Shutao
    PROCEEDINGS OF THE 12TH WORKSHOP ON COMPUTATIONAL APPROACHES TO SUBJECTIVITY, SENTIMENT & SOCIAL MEDIA ANALYSIS, 2022, : 265 - 270
  • [4] On Robustness of Prompt-based Semantic Parsing with Large Pre-trained Language Model: An Empirical Study on Codex
    Zhuo, Terry Yue
    Li, Zhuang
    Huang, Yujin
    Shiri, Fatemeh
    Wang, Weiqing
    Haffari, Gholamreza
    Li, Yuan-Fang
    17TH CONFERENCE OF THE EUROPEAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, EACL 2023, 2023, : 1090 - 1102
  • [5] BEYOND SIMPLE TEXT STYLE TRANSFER: UNVEILING COMPOUND TEXT STYLE TRANSFER WITH PROMPT-BASED PRE-TRAINED LANGUAGE MODELS
    Ju, Shuai
    Wang, Chenxu
    2024 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING, ICASSP 2024, 2024, : 6850 - 6854
  • [6] Data Augmentation Based on Pre-trained Language Model for Event Detection
    Zhang, Meng
    Xie, Zhiwen
    Liu, Jin
    CCKS 2021 - EVALUATION TRACK, 2022, 1553 : 59 - 68
  • [7] KG-prompt: Interpretable knowledge graph prompt for pre-trained language models
    Chen, Liyi
    Liu, Jie
    Duan, Yutai
    Wang, Runze
    KNOWLEDGE-BASED SYSTEMS, 2025, 311
  • [8] Exploring Pre-trained Language Models for Event Extraction and Generation
    Yang, Sen
    Feng, Dawei
    Qiao, Linbo
    Kan, Zhigang
    Li, Dongsheng
    57TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2019), 2019, : 5284 - 5294
  • [9] NtNDet: Hardware Trojan detection based on pre-trained language models
    Kuang, Shijie
    Quan, Zhe
    Xie, Guoqi
    Cai, Xiaomin
    Li, Keqin
    EXPERT SYSTEMS WITH APPLICATIONS, 2025, 271
  • [10] APrompt: Attention Prompt Tuning for Efficient Adaptation of Pre-trained Language Models
    Wang, Qifan
    Mao, Yuning
    Wang, Jingang
    Yu, Hanchao
    Li, Shaoliang
    Wang, Sinong
    Feng, Fuli
    Huang, Lifu
    Quan, Xiaojun
    Xu, Zenglin
    Liu, Dongfang
    2023 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP 2023), 2023, : 9147 - 9160