Text Classification Model Based on Graph Attention Networks and Adversarial Training

Cited by: 0
Authors
Li, Jing [1 ]
Jian, Yumei [1 ]
Xiong, Yujie [1 ]
Affiliations
[1] Shanghai Univ Engn Sci, Sch Elect & Elect Engn, Shanghai 201620, Peoples R China
Source
APPLIED SCIENCES-BASEL | 2024, Vol. 14, No. 11
Keywords
Chinese short text classification; graph attention networks; attention mechanism; adversarial training; feature fusion; ALGORITHM
DOI
10.3390/app14114906
CLC classification
O6 [Chemistry]
Discipline classification code
0703
Abstract
Featured Application: Public Opinion Analysis.
Text information on the internet is often highly time-sensitive, continually reflecting societal dynamics and evolving events. This is especially true of news text, where classifying and analyzing such immediate and varied data is essential. Existing text classification models frequently fail to represent the semantic and local feature information of texts adequately, which limits their effectiveness; the core challenge is to improve this representation so that models can capture the nuanced meanings in rapidly evolving news texts. This paper proposes a deep-learning framework designed to improve the effectiveness of text classification models. The method injects noise perturbations during training as a form of adversarial training, improving the model's generalization on original samples and increasing its robustness. A graph attention network extracts the contextual semantic information of vocabulary from sequential texts; this information is then combined with extracted sentence-level features to enrich the representation of the sequence. An attention mechanism further extracts the most salient features of the text, deepening the model's understanding of textual semantics. Experimental results demonstrate that the method successfully integrates the boundary and semantic information of vocabulary into the classification task and mines the semantic features of the text comprehensively and deeply, leading to improved classification performance.
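The abstract describes an architecture rather than an implementation. As a rough illustration of how the described components could fit together, the sketch below (plain PyTorch, not the authors' code) wires a single-head graph attention layer over per-sentence word graphs to BiLSTM sentence features, fuses them, pools with an attention layer, and applies FGM-style adversarial perturbation to the embeddings during training. All names (GraphAttentionLayer, GATAdvClassifier, fgm_training_step), dimensions, the graph construction, and the epsilon value are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch of a GAT + adversarial-training text classifier (assumed design).
import torch
import torch.nn as nn
import torch.nn.functional as F


class GraphAttentionLayer(nn.Module):
    """Single-head graph attention over per-sentence word graphs.
    adj is assumed to include self-loops so every row has at least one edge."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.W = nn.Linear(in_dim, out_dim, bias=False)
        self.a = nn.Linear(2 * out_dim, 1, bias=False)

    def forward(self, x, adj):                      # x: (B, N, F), adj: (B, N, N)
        h = self.W(x)                               # (B, N, F')
        B, N, _ = h.shape
        hi = h.unsqueeze(2).expand(B, N, N, -1)     # node i repeated over j
        hj = h.unsqueeze(1).expand(B, N, N, -1)     # node j repeated over i
        e = F.leaky_relu(self.a(torch.cat([hi, hj], dim=-1)).squeeze(-1))
        e = e.masked_fill(adj == 0, float('-inf'))  # attend only along graph edges
        alpha = torch.softmax(e, dim=-1)
        return F.elu(torch.bmm(alpha, h))           # (B, N, F')


class GATAdvClassifier(nn.Module):
    def __init__(self, vocab_size, emb_dim=128, hid_dim=128, num_classes=10):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim)
        self.gat = GraphAttentionLayer(emb_dim, hid_dim)
        self.lstm = nn.LSTM(emb_dim, hid_dim // 2, batch_first=True, bidirectional=True)
        self.att = nn.Linear(2 * hid_dim, 1)        # attention pooling over fused tokens
        self.fc = nn.Linear(2 * hid_dim, num_classes)

    def forward(self, tokens, adj, perturb=None):
        x = self.emb(tokens)
        if perturb is not None:                     # adversarial noise on embeddings
            x = x + perturb
        g = self.gat(x, adj)                        # graph-contextual word features
        s, _ = self.lstm(x)                         # sequential sentence features
        fused = torch.cat([g, s], dim=-1)           # feature fusion
        w = torch.softmax(self.att(fused).squeeze(-1), dim=-1)
        doc = torch.bmm(w.unsqueeze(1), fused).squeeze(1)
        return self.fc(doc)


def fgm_training_step(model, tokens, adj, labels, optimizer, eps=0.5):
    """One step of FGM-style adversarial training on the embedding table
    (the norm is taken over the whole table here, a simplification)."""
    optimizer.zero_grad()
    loss = F.cross_entropy(model(tokens, adj), labels)
    loss.backward()
    grad = model.emb.weight.grad
    r_adv = eps * grad / (grad.norm() + 1e-12)      # normalized perturbation
    adv_loss = F.cross_entropy(model(tokens, adj, perturb=r_adv[tokens]), labels)
    adv_loss.backward()                             # accumulate clean + adversarial grads
    optimizer.step()
    return loss.item(), adv_loss.item()
```

In practice the word graph would typically be built from co-occurrence within a sliding window over each sentence (with self-loops added), and the single attention head would be extended to multiple heads; the paper's exact fusion strategy and training schedule should be taken from the article itself.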
Pages: 17