Empowering crisis information extraction through actionability event schemata and domain-adaptive pre-training

Cited: 0
Authors
Zhang, Yuhao [1 ]
Lo, Siaw Ling [1 ]
Myint, Phyo Yi Win [1 ]
Affiliations
[1] Singapore Management University, Singapore, Singapore
Source
Information and Management, 2025, Vol. 62, Issue 1
Keywords
Adversarial machine learning
DOI
10.1016/j.im.2024.104065
Abstract
One of the persistent challenges in crisis detection is inferring actionable information to support emergency response. Existing methods focus on situational awareness but often lack actionable insights. This study proposes a holistic approach to implementing an actionability extraction system for social media, covering requirement gathering, selection of machine learning tasks, data preparation, and integration with existing resources. It offers guidance to governments, civil services, emergency workers, and researchers on supplementing existing channels with actionable information drawn from social media. Our solution leverages an actionability schema and domain-adaptive pre-training, improving upon the state-of-the-art model by 5.5% and 10.1% in micro and macro F1 scores, respectively. © 2024 Elsevier B.V.
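The paper's implementation details are not part of this record; as a rough illustration of the domain-adaptive pre-training step mentioned in the abstract, the sketch below continues masked-language-model training of a general-purpose BERT checkpoint on an unlabeled in-domain corpus before task fine-tuning. The base checkpoint, file names, and hyperparameters are assumptions for illustration, not taken from the paper.

```python
# Minimal sketch of domain-adaptive pre-training (DAPT), assuming a HuggingFace
# Transformers workflow: continue masked-language-model training of a general
# BERT checkpoint on unlabeled in-domain crisis tweets before fine-tuning on the
# labeled actionability task. File names and hyperparameters are illustrative.
from datasets import load_dataset
from transformers import (
    AutoModelForMaskedLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

BASE_CHECKPOINT = "bert-base-uncased"  # generic starting point (assumption)
tokenizer = AutoTokenizer.from_pretrained(BASE_CHECKPOINT)
model = AutoModelForMaskedLM.from_pretrained(BASE_CHECKPOINT)

# Unlabeled in-domain corpus: one crisis-related tweet per line (hypothetical file).
corpus = load_dataset("text", data_files={"train": "crisis_tweets.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=128)

tokenized = corpus.map(tokenize, batched=True, remove_columns=["text"])

# Standard MLM objective: randomly mask 15% of tokens and predict them.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)

args = TrainingArguments(
    output_dir="dapt-crisis-bert",
    per_device_train_batch_size=32,
    num_train_epochs=3,
    learning_rate=5e-5,
)

Trainer(
    model=model,
    args=args,
    train_dataset=tokenized["train"],
    data_collator=collator,
).train()

# The adapted checkpoint in "dapt-crisis-bert" would then be fine-tuned with a
# classification head on the labeled actionability schema (not shown here).
```

On the reported metrics: micro F1 aggregates over all predictions, while macro F1 averages per-class F1 scores equally, which is why the two improvements (5.5% and 10.1%) can differ.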
Related papers (18 in total)
  • [1] Krieger, Jan-David; Spinde, Timo; Ruas, Terry; Kulshrestha, Juhi; Gipp, Bela. A Domain-adaptive Pre-training Approach for Language Bias Detection in News. 2022 ACM/IEEE Joint Conference on Digital Libraries (JCDL), 2022.
  • [2] Rodríguez, Dalia A.; Diaz-Escobar, Julia; Díaz-Ramírez, Arnoldo; Trujillo, Leonardo. Domain-adaptive pre-training on a BERT model for the automatic detection of misogynistic tweets in Spanish. Social Network Analysis and Mining, 2023, 13(1).
  • [3] Ding, Junqi; Li, Bo; Xu, Chang; Qiao, Yan; Zhang, Lingxian. Diagnosing crop diseases based on domain-adaptive pre-training BERT of electronic medical records. Applied Intelligence, 2023, 53(12): 15979-15992.
  • [4] Zou, Jiaxin; Xie, Zuotong; Chen, Junhua; Hou, Jiawei; Yan, Qiang; Zheng, Hai-Tao. K-DLM: A Domain-Adaptive Language Model Pre-Training Framework with Knowledge Graph. Artificial Neural Networks and Machine Learning, ICANN 2023, Part IV, 2023, 14257: 447-459.
  • [5] Luo, Chuwei; Cheng, Changxu; Zheng, Qi; Yao, Cong. GeoLayoutLM: Geometric Pre-training for Visual Information Extraction. 2023 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2023: 7092-7101.
  • [6] Hwang, JunHa; Lee, SeungDong; Kim, HaNeul; Jeong, Young-Seob. Subset selection for domain adaptive pre-training of language model. Scientific Reports, 15(1).
  • [7] Douzon, Thibault; Duffner, Stefan; Garcia, Christophe; Espinas, Jeremy. Improving Information Extraction on Business Documents with Specific Pre-training Tasks. Document Analysis Systems, DAS 2022, 2022, 13237: 111-125.
  • [8] Lekhtman, Entony; Ziser, Yftah; Reichart, Roi. DILBERT: Customized Pre-Training for Domain Adaptation with Category Shift, with an Application to Aspect Extraction. 2021 Conference on Empirical Methods in Natural Language Processing (EMNLP 2021), 2021: 219-230.