Using Recurrent Neural Network Structure and Multi-Head Attention with Convolution for Fraudulent Phone Text Recognition

Cited by: 0
Authors
Zhou J. [1 ]
Xu H. [1 ]
Zhang Z. [1 ]
Lu J. [1 ]
Guo W. [1 ]
Li Z. [1 ]
Affiliations
[1] School of Information and Electrical Engineering, Shandong Jianzhu University, Jinan
Keywords
BiGRU; BiLSTM; CNN; multi-head attention mechanism
DOI
10.32604/csse.2023.036419
Abstract
Fraud cases pose a persistent risk to society, and people's property security is seriously threatened. Recent studies have developed many promising algorithms for offensive-text recognition on social media and for sentiment analysis, and these algorithms are also applicable to fraudulent phone text recognition. Compared with those tasks, however, the semantics of fraudulent text are more complex and harder to distinguish. Most text classification research extracts text features with Recurrent Neural Networks (RNNs) and their variants, Convolutional Neural Networks (CNNs), or hybrid neural networks, but a single network or a simple combination of networks cannot capture sufficiently rich feature knowledge of fraudulent phone texts. Therefore, this paper proposes a new model. The knowledge a model can learn from fraudulent phone text includes the sequential structure of sentences, the correlation between words, contextual semantic correlations, and the features of keywords in sentences. The new model combines a bidirectional Long Short-Term Memory network (BiLSTM) or a bidirectional Gated Recurrent Unit (BiGRU) with a multi-head attention mechanism module with convolution, and a normalization layer is added after the output of the final hidden layer. BiLSTM or BiGRU builds the encoding and decoding layers, while the multi-head attention mechanism module with convolution (MHAC) strengthens the model's ability to learn both global interaction information and multi-granularity local interaction information in fraudulent sentences. A fraudulent phone text dataset was produced by the authors, and experiments were conducted on it and on the THUCNews dataset. Experimental results show that, compared with the baseline models, the proposed model (LMHACL) achieves the best Accuracy, Precision, Recall, and F1 scores on both datasets, with all performance indexes on the fraudulent phone text dataset above 0.94. © 2023 CRL Publishing. All rights reserved.
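The abstract does not give the authors' implementation, but the MHAC idea it describes, multi-head self-attention capturing global interactions between tokens combined with a 1-D convolution capturing multi-granularity local features, can be illustrated with a minimal NumPy sketch. All dimensions, the random weights, and the additive fusion of the two branches below are hypothetical choices for illustration, not details taken from the paper:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(x, num_heads, rng):
    """Scaled dot-product self-attention with several heads.

    x: (seq_len, d_model) hidden states, e.g. the output of a BiGRU/BiLSTM.
    Each head attends over the whole sequence -> global interaction info.
    """
    seq_len, d_model = x.shape
    d_head = d_model // num_heads
    heads = []
    for _ in range(num_heads):
        # Random projection matrices stand in for learned parameters.
        Wq = rng.standard_normal((d_model, d_head)) / np.sqrt(d_model)
        Wk = rng.standard_normal((d_model, d_head)) / np.sqrt(d_model)
        Wv = rng.standard_normal((d_model, d_head)) / np.sqrt(d_model)
        Q, K, V = x @ Wq, x @ Wk, x @ Wv
        scores = softmax(Q @ K.T / np.sqrt(d_head))   # (seq_len, seq_len)
        heads.append(scores @ V)                      # (seq_len, d_head)
    return np.concatenate(heads, axis=-1)             # (seq_len, d_model)

def conv1d_branch(x, kernel_size, rng):
    """Same-padded 1-D convolution over the sequence axis.

    A small kernel mixes each token with its neighbours -> local
    (n-gram-like) interaction features at one granularity.
    """
    seq_len, d_model = x.shape
    pad = kernel_size // 2
    xp = np.pad(x, ((pad, pad), (0, 0)))
    w = rng.standard_normal((kernel_size, d_model)) / kernel_size
    return np.stack([(xp[t:t + kernel_size] * w).sum(axis=0)
                     for t in range(seq_len)])        # (seq_len, d_model)

rng = np.random.default_rng(0)
h = rng.standard_normal((12, 64))  # pretend recurrent hidden states: 12 tokens
# Fuse global (attention) and local (convolution) features; the fusion rule
# here (simple addition) is an assumption for the sketch.
mhac = multi_head_attention(h, num_heads=4, rng=rng) + conv1d_branch(h, 3, rng)
print(mhac.shape)  # -> (12, 64)
```

In the full model described by the abstract, this kind of module would sit between the recurrent encoder and the classification layer; several convolution kernel sizes could be combined to obtain the "multi-granularity" local features the abstract mentions.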
Pages: 2277-2297
Page count: 20
Related Papers
(50 items total)
  • [1] Recurrent multi-head attention fusion network for combining audio and text for speech emotion recognition
    Ahn, Chung-Soo
    Kasun, L. L. Chamara
    Sivadas, Sunil
    Rajapakse, Jagath C.
    INTERSPEECH 2022, 2022, : 744 - 748
  • [2] Using recurrent neural network structure with Enhanced Multi-Head Self-Attention for sentiment analysis
    Xue-Liang Leng
    Xiao-Ai Miao
    Tao Liu
    Multimedia Tools and Applications, 2021, 80 : 12581 - 12600
  • [3] Using recurrent neural network structure with Enhanced Multi-Head Self-Attention for sentiment analysis
    Leng, Xue-Liang
    Miao, Xiao-Ai
    Liu, Tao
    MULTIMEDIA TOOLS AND APPLICATIONS, 2021, 80 (08) : 12581 - 12600
  • [4] EEG-Based Emotion Recognition Using Convolutional Recurrent Neural Network with Multi-Head Self-Attention
    Hu, Zhangfang
    Chen, Libujie
    Luo, Yuan
    Zhou, Jingfan
    APPLIED SCIENCES-BASEL, 2022, 12 (21)
  • [5] Attention induced multi-head convolutional neural network for human activity recognition
    Khan, Zanobya N.
    Ahmad, Jamil
    APPLIED SOFT COMPUTING, 2021, 110
  • [6] Automatic scene generation using sentiment analysis and bidirectional recurrent neural network with multi-head attention
    Dharaniya, R.
    Indumathi, J.
    Uma, G. V.
    NEURAL COMPUTING & APPLICATIONS, 2022, 34 (19): : 16945 - 16958
  • [7] Automatic scene generation using sentiment analysis and bidirectional recurrent neural network with multi-head attention
    R. Dharaniya
    J. Indumathi
    G. V. Uma
    Neural Computing and Applications, 2022, 34 : 16945 - 16958
  • [8] Bidirectional recurrent neural network with multi-head attention for automatic scene generation using sentiment analysis
    Dharaniya, R.
    Indumathi, J.
    Uma, G., V
    JOURNAL OF INTELLIGENT & FUZZY SYSTEMS, 2022, 43 (06) : 7023 - 7039
  • [9] Multimodal Approach of Speech Emotion Recognition Using Multi-Level Multi-Head Fusion Attention-Based Recurrent Neural Network
    Ngoc-Huynh Ho
    Yang, Hyung-Jeong
    Kim, Soo-Hyung
    Lee, Gueesang
    IEEE ACCESS, 2020, 8 : 61672 - 61686
  • [10] Speech Emotion Recognition Using Convolution Neural Networks and Multi-Head Convolutional Transformer
    Ullah, Rizwan
    Asif, Muhammad
    Shah, Wahab Ali
    Anjam, Fakhar
    Ullah, Ibrar
    Khurshaid, Tahir
    Wuttisittikulkij, Lunchakorn
    Shah, Shashi
    Ali, Syed Mansoor
    Alibakhshikenari, Mohammad
    SENSORS, 2023, 23 (13)