Enhancing rumor detection with data augmentation and generative pre-trained transformer

Cited: 1
Authors
Askarizade, Mojgan [1 ]
Affiliations
[1] Ardakan Univ, Fac Engn, Dept Comp Engn, Ardakan, Yazd, Iran
Keywords
Fake news detection; Fine-tuned language model; Neural network classifier; Rumor detection; Generative pre-trained transformer; Data augmentation
DOI
10.1016/j.eswa.2024.125649
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
The advent of social networks has facilitated the rapid dissemination of false information, including rumors, causing significant damage to society and individuals. Extensive research has been devoted to rumor detection, ranging from classical machine learning techniques to neural networks. However, existing methods fail to learn the deep semantics of rumor text, and the imbalanced datasets common in the rumor domain further reduce their effectiveness. This study addresses these challenges by leveraging the Generative Pre-trained Transformer 2 (GPT-2) model to generate rumor-like texts, thereby creating a balanced dataset. Subsequently, a novel approach to classifying rumor texts is proposed by modifying the GPT-2 model. We compare our results with state-of-the-art machine learning and deep learning methods, as well as pre-trained models, on the PHEME, Twitter15, and Twitter16 datasets. Our findings demonstrate that the proposed model improves accuracy and F-measure in rumor detection compared to previous methods.
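The abstract describes a two-step pipeline: GPT-2 first generates rumor-like texts to rebalance the training data, and a modified GPT-2 is then fine-tuned as the classifier. Below is a minimal sketch of that idea, assuming the HuggingFace transformers library; the base checkpoint ("gpt2"), sampling settings, seed text, and label convention are illustrative assumptions, not the paper's exact configuration.

```python
# A minimal sketch of the two-step approach described in the abstract,
# assuming HuggingFace `transformers`; hyperparameters are illustrative.
import torch
from transformers import (
    GPT2Tokenizer,
    GPT2LMHeadModel,
    GPT2ForSequenceClassification,
)

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token by default

# Step 1: augment the minority (rumor) class with GPT-2-generated text.
generator = GPT2LMHeadModel.from_pretrained("gpt2")

def generate_rumor_like(seed_text, n=3, max_new_tokens=60):
    """Sample continuations of a real rumor to create synthetic examples."""
    inputs = tokenizer(seed_text, return_tensors="pt")
    outputs = generator.generate(
        **inputs,
        do_sample=True,            # stochastic sampling for diverse augmentations
        top_p=0.95,
        max_new_tokens=max_new_tokens,
        num_return_sequences=n,
        pad_token_id=tokenizer.eos_token_id,
    )
    return [tokenizer.decode(o, skip_special_tokens=True) for o in outputs]

# Step 2: fine-tune GPT-2 with a classification head on the balanced dataset.
classifier = GPT2ForSequenceClassification.from_pretrained("gpt2", num_labels=2)
classifier.config.pad_token_id = tokenizer.pad_token_id

batch = tokenizer(
    ["Breaking: officials have not confirmed the viral claim."],
    return_tensors="pt", padding=True, truncation=True,
)
labels = torch.tensor([1])  # 1 = rumor, 0 = non-rumor (assumed label convention)
loss = classifier(**batch, labels=labels).loss  # minimized during fine-tuning
loss.backward()
```

In this sketch the generated continuations would be appended to the rumor class until the two classes are balanced, after which the classification head is trained on the combined data; the paper's modifications to GPT-2 itself are not reproduced here.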
Pages: 11
Related Papers
50 records in total
  • [21] Chat Generative Pre-trained Transformer: why we should embrace this technology
    Chavez, Martin R.
    Butler, Thomas S.
    Rekawek, Patricia
    Heo, Hye
    Kinzler, Wendy L.
    AMERICAN JOURNAL OF OBSTETRICS AND GYNECOLOGY, 2023, 228 (06) : 706 - 711
  • [22] The utility of Chat Generative Pre-trained Transformer as a patient resource in paediatric otolaryngology
    Jongbloed, Walter M.
    Grover, Nancy
JOURNAL OF LARYNGOLOGY AND OTOLOGY, 2024
  • [23] Generative pre-trained transformer (GPT)-4 support for differential diagnosis in neuroradiology
    Sorin, Vera
    Klang, Eyal
    Sobeh, Tamer
    Konen, Eli
    Shrot, Shai
    Livne, Adva
    Weissbuch, Yulian
    Hoffmann, Chen
    Barash, Yiftach
    QUANTITATIVE IMAGING IN MEDICINE AND SURGERY, 2024, 14 (10)
  • [24] Quantitative Advancements in Clinical Accuracy of Successive Generative Pre-Trained Transformer Models
    Tate, Hudson
    Hambright, Ben
    Clark, Abby
    Dixon, Cory
    Kronz, Ben
    Ricks, James
    Spaedy, Olivia
    Whalen, Sydney
    Butler, Danner
    Bicknell, Brenton
    JOURNAL OF INVESTIGATIVE MEDICINE, 2024, 72 (06)
  • [25] Generative Pre-trained Transformer 4 (GPT-4) in clinical settings
    Bellini, Valentina
    Bignami, Elena Giovanna
LANCET DIGITAL HEALTH, 2025, 7 (01) : e6 - e7
  • [26] Chat generative pre-trained transformer (ChatGPT): potential implications for rheumatology practice
    Nune, Arvind
Iyengar, Karthikeyan P.
    Manzo, Ciro
    Barman, Bhupen
    Botchu, Rajesh
    RHEUMATOLOGY INTERNATIONAL, 2023, 43 (07) : 1379 - 1380
  • [27] Potential applications of Chat Generative Pre-trained Transformer in obstetrics and gynecology: comment
    Daungsupawong, Hinpetch
    Wiwanitkit, Viroj
    OBSTETRICS & GYNECOLOGY SCIENCE, 2024, 67 (03) : 341 - 342
  • [28] Pre-Trained Image Processing Transformer
    Chen, Hanting
    Wang, Yunhe
    Guo, Tianyu
    Xu, Chang
    Deng, Yiping
    Liu, Zhenhua
    Ma, Siwei
    Xu, Chunjing
    Xu, Chao
    Gao, Wen
    2021 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION, CVPR 2021, 2021, : 12294 - 12305
  • [29] DFEPT: Data Flow Embedding for Enhancing Pre-Trained Model Based Vulnerability Detection
    Jiang, Zhonghao
    Sun, Weifeng
    Gu, Xiaoyan
    Wu, Jiaxin
    Wen, Tao
    Hu, Haibo
    Yan, Meng
    PROCEEDINGS OF THE 15TH ASIA-PACIFIC SYMPOSIUM ON INTERNETWARE, INTERNETWARE 2024, 2024, : 95 - 104
  • [30] Towards JavaScript program repair with Generative Pre-trained Transformer (GPT-2)
    Lajko, Mark
    Csuvik, Viktor
    Vidacs, Laszlo
PROCEEDINGS - INTERNATIONAL WORKSHOP ON AUTOMATED PROGRAM REPAIR, APR 2022, 2022, : 61 - 68