Using Recurrent Neural Network Structure and Multi-Head Attention with Convolution for Fraudulent Phone Text Recognition

Cited by: 0
Authors
Zhou J. [1 ]
Xu H. [1 ]
Zhang Z. [1 ]
Lu J. [1 ]
Guo W. [1 ]
Li Z. [1 ]
Institutions
[1] School of Information and Electrical Engineering, Shandong Jianzhu University, Jinan
Keywords
BiGRU; BiLSTM; CNN; multi-head attention mechanism;
DOI
10.32604/csse.2023.036419
Abstract
Fraud cases pose a persistent risk to society, and people’s property security is greatly threatened. Recent studies have developed many promising algorithms for recognizing offensive text on social media and for sentiment analysis, and these algorithms are also applicable to fraudulent phone text recognition. Compared with those tasks, however, the semantics of fraudulent text are more complex and harder to distinguish. Most text classification research extracts text features with Recurrent Neural Networks (RNNs), RNN variants, Convolutional Neural Networks (CNNs), or hybrid neural networks. However, a single network or a simple combination of networks cannot capture sufficiently rich feature knowledge from fraudulent phone texts. Therefore, this paper proposes a new model. The knowledge a model can learn from fraudulent phone text includes the sequential structure of sentences, the correlation between words, contextual semantic correlations, and the features of keywords in sentences. The proposed model combines a bidirectional Long Short-Term Memory network (BiLSTM) or a bidirectional Gated Recurrent Unit (BiGRU) with a multi-head attention mechanism module with convolution (MHAC), and a normalization layer is added after the output of the final hidden layer. BiLSTM or BiGRU builds the encoding and decoding layers, while MHAC enhances the model’s ability to learn both global interaction information and multi-granularity local interaction information in fraudulent sentences. We also produce a fraudulent phone text dataset in this paper. Experiments use the THUCNews dataset and the fraudulent phone text dataset. The results show that, compared with the baseline models, the proposed model (LMHACL) achieves the best Accuracy, Precision, Recall, and F1 score on both datasets, with all performance indexes on the fraudulent phone text dataset above 0.94. © 2023 CRL Publishing. All rights reserved.
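The architecture described in the abstract can be sketched as follows. This is a minimal, hypothetical PyTorch reconstruction based only on the abstract's description (BiGRU encoder, multi-head attention with a parallel convolution branch, normalization after the hidden layer); all layer sizes, the fusion by summation, and the mean-pooling classifier head are illustrative assumptions, not the authors' actual implementation.

```python
import torch
import torch.nn as nn

class MHAC(nn.Module):
    """Multi-head attention with convolution (assumed design): the attention
    branch captures global interactions, the 1-D convolution branch captures
    local n-gram interactions; the two views are fused by addition."""
    def __init__(self, dim, heads=4, kernel_size=3):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.conv = nn.Conv1d(dim, dim, kernel_size, padding=kernel_size // 2)

    def forward(self, x):                      # x: (batch, seq, dim)
        global_feat, _ = self.attn(x, x, x)    # global interaction information
        local_feat = self.conv(x.transpose(1, 2)).transpose(1, 2)  # local features
        return global_feat + local_feat

class LMHACL(nn.Module):
    """Illustrative sketch: embedding -> BiGRU -> MHAC -> LayerNorm -> classifier."""
    def __init__(self, vocab_size=5000, embed_dim=128, hidden=64, num_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.bigru = nn.GRU(embed_dim, hidden, bidirectional=True, batch_first=True)
        self.mhac = MHAC(2 * hidden)
        self.norm = nn.LayerNorm(2 * hidden)   # normalization after the hidden layer
        self.fc = nn.Linear(2 * hidden, num_classes)

    def forward(self, tokens):                 # tokens: (batch, seq) of token ids
        x = self.embed(tokens)
        x, _ = self.bigru(x)                   # contextual sequence encoding
        x = self.norm(self.mhac(x))
        return self.fc(x.mean(dim=1))          # pool over sequence, then classify

model = LMHACL()
logits = model(torch.randint(0, 5000, (2, 20)))
print(logits.shape)  # torch.Size([2, 2])
```

A BiLSTM variant would simply swap `nn.GRU` for `nn.LSTM` with the same bidirectional configuration.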
Pages: 2277 - 2297
Page count: 20
Related Papers
50 in total
  • [41] Fraudulent phone call recognition method based on convolutional neural network
    Xing, Jian
    Wang, Shupeng
    Ding, Yu
    High Technology Letters, 2020, 26 (04) : 367 - 371
  • [42] A Multi-Head Convolutional Neural Network with Multi-Path Attention Improves Image Denoising
    Zhang, Jiahong
    Qu, Meijun
    Wang, Ye
    Cao, Lihong
    PRICAI 2022: TRENDS IN ARTIFICIAL INTELLIGENCE, PT III, 2022, 13631 : 338 - 351
  • [43] An artificial neural network using multi-head intermolecular attention for predicting chemical reactivity of organic materials
    Yoo, Jaekyun
    Kim, Byunghoon
    Lee, Byungju
    Song, Jun-hyuk
    Kang, Kisuk
    JOURNAL OF MATERIALS CHEMISTRY A, 2023, 11 (24) : 12784 - 12792
  • [44] Prediction of Lithium Battery Voltage and State of Charge Using Multi-Head Attention BiLSTM Neural Network
    Xi, Haiwen
    Lv, Taolin
    Qin, Jincheng
    Ma, Mingsheng
    Xie, Jingying
    Lu, Shigang
    Liu, Zhifu
    APPLIED SCIENCES-BASEL, 2025, 15 (06):
  • [45] Fraudulent phone call recognition method based on convolutional neural network
    Xing J.
    Wang S.
    Ding Y.
    High Technology Letters, 2020, 26 (04) : 367 - 371
  • [46] Classification Algorithm for Electroencephalogram-based Motor Imagery Using Hybrid Neural Network with Spatio-temporal Convolution and Multi-head Attention Mechanism
    Shi, Xingbin
    Li, Baojiang
    Qin, Wenlong
    Wang, Yuxin
    Wang, Haiyan
    Wang, Xichao
    NEUROSCIENCE, 2023, 527 : 64 - 73
  • [47] Temporal Residual Network Based Multi-Head Attention Model for Arabic Handwriting Recognition
    Zouari, Ramzi
    Othmen, Dalila
    Boubaker, Houcine
    Kherallah, Monji
    INTERNATIONAL ARAB JOURNAL OF INFORMATION TECHNOLOGY, 2023, 20 (3A) : 469 - 476
  • [48] A facial depression recognition method based on hybrid multi-head cross attention network
    Li, Yutong
    Liu, Zhenyu
    Zhou, Li
    Yuan, Xiaoyan
    Shangguan, Zixuan
    Hu, Xiping
    Hu, Bin
    FRONTIERS IN NEUROSCIENCE, 2023, 17
  • [49] A Supervised Multi-Head Self-Attention Network for Nested Named Entity Recognition
    Xu, Yongxiu
    Huang, Heyan
    Feng, Chong
    Hu, Yue
    THIRTY-FIFTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THIRTY-THIRD CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE AND THE ELEVENTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2021, 35 : 14185 - 14193
  • [50] Research on Transportation Mode Recognition Based on Multi-Head Attention Temporal Convolutional Network
    Cheng, Shuyu
    Liu, Yingan
    SENSORS, 2023, 23 (07)