Using Recurrent Neural Network Structure and Multi-Head Attention with Convolution for Fraudulent Phone Text Recognition

Cited by: 0

Authors
Zhou J. [1 ]
Xu H. [1 ]
Zhang Z. [1 ]
Lu J. [1 ]
Guo W. [1 ]
Li Z. [1 ]
Affiliations
[1] School of Information and Electrical Engineering, Shandong Jianzhu University, Jinan
Source
Keywords
BiGRU; BiLSTM; CNN; multi-head attention mechanism
DOI
10.32604/csse.2023.036419
CLC Number
Subject Classification Number
Abstract
Fraud cases pose a persistent risk to society, and people’s property security is seriously threatened. Recent studies have developed many promising algorithms for offensive-text recognition on social media and for sentiment analysis, and these algorithms are also applicable to fraudulent phone text recognition. Compared with those tasks, however, the semantics of fraudulent text are more complex and harder to distinguish. Most text classification research extracts text features with Recurrent Neural Networks (RNNs), RNN variants, Convolutional Neural Networks (CNNs), or hybrid neural networks. However, a single network or a simple combination of networks cannot capture sufficiently rich feature knowledge of fraudulent phone texts. Therefore, this paper proposes a new model. The knowledge that a model can learn from fraudulent phone text includes the sequential structure of sentences, the correlations between words, contextual semantic relations, and the features of keywords in sentences. The new model combines a bidirectional Long Short-Term Memory network (BiLSTM) or a bidirectional Gated Recurrent Unit (BiGRU) with a multi-head attention module with convolution; a normalization layer is added after the output of the final hidden layer. BiLSTM or BiGRU builds the encoding and decoding layers, while the multi-head attention module with convolution (MHAC) strengthens the model’s ability to learn both global interaction information and multi-granularity local interaction information in fraudulent sentences. We also construct a fraudulent phone text dataset. Experiments on the THUCNews dataset and the fraudulent phone text dataset show that, compared with the baseline models, the proposed model (LMHACL) achieves the best Accuracy, Precision, Recall, and F1 score on both datasets, with all performance indexes on the fraudulent phone text dataset above 0.94. © 2023 CRL Publishing. All rights reserved.
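The abstract does not specify the internals of the MHAC module, but its stated purpose is to combine global interaction information (multi-head self-attention) with multi-granularity local interaction information (convolution). A minimal, untrained NumPy sketch of that general idea, with randomly initialized weights and hypothetical shapes chosen purely for illustration, might look like:

```python
import numpy as np

def multi_head_attention(x, num_heads, rng):
    """Scaled dot-product self-attention with several heads (random weights)."""
    seq_len, d_model = x.shape
    d_head = d_model // num_heads
    heads = []
    for _ in range(num_heads):
        Wq = rng.standard_normal((d_model, d_head)) / np.sqrt(d_model)
        Wk = rng.standard_normal((d_model, d_head)) / np.sqrt(d_model)
        Wv = rng.standard_normal((d_model, d_head)) / np.sqrt(d_model)
        q, k, v = x @ Wq, x @ Wk, x @ Wv
        scores = q @ k.T / np.sqrt(d_head)               # (seq_len, seq_len)
        weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
        weights /= weights.sum(axis=-1, keepdims=True)   # softmax over keys
        heads.append(weights @ v)                        # global interactions
    return np.concatenate(heads, axis=-1)                # back to (seq_len, d_model)

def conv1d_same(x, kernel_size, rng):
    """1-D convolution over the sequence axis with 'same' padding:
    each output position mixes a small local window of tokens."""
    seq_len, d_model = x.shape
    pad = kernel_size // 2
    xp = np.pad(x, ((pad, pad), (0, 0)))
    w = rng.standard_normal((kernel_size, d_model)) / kernel_size
    out = np.zeros_like(x)
    for t in range(seq_len):
        out[t] = (xp[t:t + kernel_size] * w).sum(axis=0)  # local interactions
    return out

def mhac_block(x, num_heads=4, kernel_size=3, seed=0):
    """Hypothetical MHAC-style block: sum the global (attention) and
    local (convolution) views of the same token sequence."""
    rng = np.random.default_rng(seed)
    return multi_head_attention(x, num_heads, rng) + conv1d_same(x, kernel_size, rng)

x = np.random.default_rng(1).standard_normal((10, 16))   # 10 tokens, 16-dim embeddings
y = mhac_block(x)
print(y.shape)  # (10, 16)
```

In the full model described by the abstract, a block like this would sit between the BiLSTM/BiGRU encoder and decoder layers, with a normalization layer after the final hidden output; the head count, kernel size, and the way the two branches are merged here are assumptions, not details taken from the paper.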
Pages: 2277-2297
Page count: 20
Related Papers
50 in total
  • [21] Bilinear Multi-Head Attention Graph Neural Network for Traffic Prediction
    Hu, Haibing
    Han, Kai
    Yin, Zhizhuo
    ICAART: PROCEEDINGS OF THE 14TH INTERNATIONAL CONFERENCE ON AGENTS AND ARTIFICIAL INTELLIGENCE - VOL 2, 2022, : 33 - 43
  • [22] Distract Your Attention: Multi-Head Cross Attention Network for Facial Expression Recognition
    Wen, Zhengyao
    Lin, Wenzhong
    Wang, Tao
    Xu, Ge
    BIOMIMETICS, 2023, 8 (02)
  • [23] Augmented Convolutional Neural Network Models with Relative Multi-Head Attention for Target Recognition in Infrared Images
    Nebili, Billel
    Khellal, Atmane
    Nemra, Abdelkrim
    Mascarilla, Laurent
    UNMANNED SYSTEMS, 2023, 11 (03) : 221 - 230
  • [24] Multi-Head Attention Neural Network for Smartphone Invariant Indoor Localization
    Tiku, Saideep
    Gufran, Danish
    Pasricha, Sudeep
    2022 IEEE 12TH INTERNATIONAL CONFERENCE ON INDOOR POSITIONING AND INDOOR NAVIGATION (IPIN 2022), 2022,
  • [25] Point Cloud Upsampling Network Incorporating Dynamic Graph Convolution and Multi-Head Attention
    Yang, Xiaoping
    Chen, Fei
    Li, Zhenhua
    Liu, Guanghui
    INFORMATION TECHNOLOGY AND CONTROL, 2024, 53 (04):
  • [26] DDNet: a hybrid network based on deep adaptive multi-head attention and dynamic graph convolution for EEG emotion recognition
    Xu, Bingyue
    Zhang, Xin
    Zhang, Xiu
    Sun, Baiwei
    Wang, Yujie
    SIGNAL IMAGE AND VIDEO PROCESSING, 2025, 19 (04)
  • [27] Building pattern recognition by using an edge-attention multi-head graph convolutional network
    Wang, Haitao
    Xu, Yongyang
    Hu, Anna
    Xie, Xuejing
    Chen, Siqiong
    Xie, Zhong
    INTERNATIONAL JOURNAL OF GEOGRAPHICAL INFORMATION SCIENCE, 2025, 39 (04) : 732 - 757
  • [28] Construction and Application of Text Entity Relation Joint Extraction Model Based on Multi-Head Attention Neural Network
    Xue, Yafei
    Zhu, Jing
    Lyu, Jing
    COMPUTATIONAL INTELLIGENCE AND NEUROSCIENCE, 2022, 2022
  • [30] Multi-Head Attention Affinity Diversity Sharing Network for Facial Expression Recognition
    Zheng, Caixia
    Liu, Jiayu
    Zhao, Wei
    Ge, Yingying
    Chen, Wenhe
    ELECTRONICS, 2024, 13 (22)