Enhancing rumor detection with data augmentation and generative pre-trained transformer

Cited by: 1
Author
Askarizade, Mojgan [1 ]
Affiliation
[1] Ardakan Univ, Fac Engn, Dept Comp Engn, Ardakan, Yazd, Iran
Keywords
Fake news detection; Finetuned language model; Neural network classifier; Rumor detection; Generative pre-trained transformer; Data augmentation;
DOI
10.1016/j.eswa.2024.125649
CLC number
TP18 [Artificial intelligence theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The advent of social networks has facilitated the rapid dissemination of false information, including rumors, leading to significant societal and individual harm. Extensive research has been dedicated to rumor detection, ranging from machine learning techniques to neural networks. However, existing methods fail to capture the deeper semantics of rumor texts, and the imbalanced datasets common in the rumor domain further reduce their effectiveness. This study addresses these challenges by leveraging the Generative Pre-trained Transformer 2 (GPT-2) model to generate rumor-like texts, thus creating a balanced dataset. Subsequently, a novel approach for classifying rumor texts is proposed by modifying the GPT-2 model. We compare our results with state-of-the-art machine learning and deep learning methods, as well as pre-trained models, on the PHEME, Twitter15, and Twitter16 datasets. Our findings demonstrate that the proposed model improves accuracy and F-measure in rumor detection compared to previous methods.
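The augmentation step described in the abstract can be sketched as follows. This is a minimal illustration of balancing a rumor dataset by oversampling the minority class with generated texts; `generate_rumor_like` is a hypothetical stand-in for an actual GPT-2 generation call and is not the paper's implementation.

```python
from collections import Counter

def generate_rumor_like(seed_text: str) -> str:
    # Hypothetical placeholder: in the paper's setting this would be a
    # GPT-2 continuation of the seed text (e.g. via model.generate(...)
    # on a GPT-2 model fine-tuned on rumor texts).
    return seed_text + " [synthetic continuation]"

def balance_with_augmentation(texts, labels, minority_label):
    """Oversample the minority class with generated rumor-like texts
    until every class matches the size of the largest class."""
    counts = Counter(labels)
    majority = max(counts.values())
    # Seed texts drawn from the real minority-class examples.
    seeds = [t for t, y in zip(texts, labels) if y == minority_label]
    out_texts, out_labels = list(texts), list(labels)
    i = 0
    while counts[minority_label] < majority:
        out_texts.append(generate_rumor_like(seeds[i % len(seeds)]))
        out_labels.append(minority_label)
        counts[minority_label] += 1
        i += 1
    return out_texts, out_labels
```

With three non-rumor and one rumor example, the sketch adds two synthetic rumor texts so both classes have three instances; the subsequent classification step would then fine-tune GPT-2 with a classification head on the balanced set.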
Pages: 11