Using Recurrent Neural Network Structure and Multi-Head Attention with Convolution for Fraudulent Phone Text Recognition

Cited by: 0
Authors:
Zhou J. [1]
Xu H. [1]
Zhang Z. [1]
Lu J. [1]
Guo W. [1]
Li Z. [1]
Affiliations:
[1] School of Information and Electrical Engineering, Shandong Jianzhu University, Jinan
Source: Computer Systems Science and Engineering
Keywords: BiGRU; BiLSTM; CNN; multi-head attention mechanism
DOI: 10.32604/csse.2023.036419
Abstract:
Fraud cases pose a persistent risk to society, and people’s property security is seriously threatened. Recent studies have developed many promising algorithms for offensive text recognition on social media and for sentiment analysis, and these algorithms can also be applied to fraudulent phone text recognition. Compared with those tasks, however, the semantics of fraudulent texts are more complex and harder to distinguish. Most text classification research extracts text features with Recurrent Neural Networks (RNNs), their variants, Convolutional Neural Networks (CNNs), or hybrid neural networks. However, a single network or a simple combination of networks cannot capture sufficiently rich features of fraudulent phone texts. Therefore, this paper proposes a new model. From fraudulent phone texts, the model can learn the sequential structure of sentences, the correlations between words, contextual semantic correlations, the features of keywords in sentences, and so on. The new model combines a bidirectional Long Short-Term Memory network (BiLSTM) or a bidirectional Gated Recurrent Unit (BiGRU) with a multi-head attention module with convolution, and a normalization layer is added after the output of the final hidden layer. BiLSTM or BiGRU is used to build the encoding and decoding layers, while the multi-head attention module with convolution (MHAC) enhances the model’s ability to learn both global interaction information and multi-granularity local interaction information in fraudulent sentences. We also construct a fraudulent phone text dataset. Experiments are conducted on the THUCNews dataset and our fraudulent phone text dataset. The results show that, compared with the baseline models, the proposed model (LMHACL) achieves the best Accuracy, Precision, Recall, and F1 score on both datasets, with all four metrics above 0.94 on the fraudulent phone text dataset. © 2023 CRL Publishing. All rights reserved.
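To make the described architecture concrete, the following is a minimal PyTorch sketch of one plausible reading of the LMHACL design: a BiGRU encoder, a multi-head self-attention branch in parallel with multi-granularity convolutions (standing in for the MHAC module), and a normalization layer before classification. All layer sizes, kernel widths, the additive fusion step, and the class LMHACL itself are illustrative assumptions, not the authors' exact configuration.

import torch
import torch.nn as nn

class LMHACL(nn.Module):
    # Hypothetical sketch of the paper's model: BiGRU encoder + MHAC-style module.
    def __init__(self, vocab_size, embed_dim=128, hidden_dim=128,
                 num_heads=4, kernel_sizes=(2, 3, 4), num_classes=2):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        # BiGRU encoder (a BiLSTM could be swapped in, as the abstract allows).
        self.encoder = nn.GRU(embed_dim, hidden_dim,
                              batch_first=True, bidirectional=True)
        d_model = 2 * hidden_dim
        # Multi-head self-attention: global interaction information.
        self.attention = nn.MultiheadAttention(d_model, num_heads,
                                               batch_first=True)
        # Parallel 1-D convolutions: multi-granularity local interactions.
        self.convs = nn.ModuleList(
            [nn.Conv1d(d_model, d_model, k, padding=k // 2)
             for k in kernel_sizes])
        self.norm = nn.LayerNorm(d_model)  # normalization after the final hidden output
        self.classifier = nn.Linear(d_model, num_classes)

    def forward(self, x):                       # x: (batch, seq_len) token ids
        h, _ = self.encoder(self.embedding(x))  # (batch, seq_len, 2*hidden_dim)
        attn_out, _ = self.attention(h, h, h)   # global features
        conv_in = h.transpose(1, 2)             # Conv1d expects (batch, channels, seq)
        conv_out = sum(torch.relu(conv(conv_in)).transpose(1, 2)[:, :h.size(1)]
                       for conv in self.convs)  # trim even-kernel padding overhang
        fused = self.norm(attn_out + conv_out)  # fuse global + local, then normalize
        return self.classifier(fused.mean(dim=1))  # mean-pool over time, classify

# Usage with hypothetical sizes: a batch of 8 texts, 64 token ids each.
model = LMHACL(vocab_size=30000)
logits = model(torch.randint(0, 30000, (8, 64)))  # shape: (8, 2)

Training such a classifier would follow the standard supervised loop with cross-entropy loss over labeled fraudulent and normal texts.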
Pages: 2277-2297 (20 pages)