Hierarchical multi-attention networks for document classification

Cited by: 1
Authors
Yingren Huang
Jiaojiao Chen
Shaomin Zheng
Yun Xue
Xiaohui Hu
Affiliations
[1] Guangdong University of Foreign Studies, Laboratory of Language Engineering and Computing
[2] South China Normal University, Guangdong Provincial Key Laboratory of Quantum Engineering and Quantum Materials, School of Physics and Telecommunication Engineering
Keywords
Document classification; Hierarchical network; Bi-GRU; Attention mechanism;
DOI: Not available
Abstract
Research on document classification increasingly employs attention-based deep learning algorithms and achieves impressive results. Owing to the complexity of documents, classical models, as well as single attention mechanisms, fail to meet the demand for high-accuracy classification. This paper proposes a method that classifies documents via hierarchical multi-attention networks, which describe a document at the word-sentence level and the sentence-document level. Further, different attention strategies are applied at the different levels, enabling accurate assignment of attention weights. Specifically, a soft attention mechanism is applied at the word-sentence level and CNN-attention at the sentence-document level. Owing to the distinctiveness of the model, the proposed method delivers the highest accuracy compared with other state-of-the-art methods. In addition, attention-weight visualizations demonstrate the effectiveness of the attention mechanism in distinguishing importance.
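As a reading aid, the sketch below illustrates the kind of architecture the abstract describes: a Bi-GRU word encoder pooled by soft attention into sentence vectors, and a Bi-GRU sentence encoder pooled by a convolution-based attention into a document vector that is classified. This is a minimal PyTorch interpretation; the class names, dimensions, and the exact form of the CNN-attention are assumptions, not the authors' implementation.

```python
# Minimal sketch of a hierarchical multi-attention document classifier,
# assuming PyTorch. The abstract only names the components (Bi-GRU encoders,
# soft attention at the word-sentence level, CNN-attention at the
# sentence-document level); everything else here is an illustrative assumption.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SoftAttention(nn.Module):
    """Additive (soft) attention that pools a sequence into a single vector."""
    def __init__(self, dim):
        super().__init__()
        self.proj = nn.Linear(dim, dim)
        self.context = nn.Linear(dim, 1, bias=False)

    def forward(self, h):                                   # h: (batch, seq, dim)
        scores = self.context(torch.tanh(self.proj(h)))     # (batch, seq, 1)
        alpha = F.softmax(scores, dim=1)
        return (alpha * h).sum(dim=1)                       # (batch, dim)


class CNNAttention(nn.Module):
    """Attention whose scores come from a 1-D convolution over the sequence
    (one assumed reading of 'CNN-attention')."""
    def __init__(self, dim, kernel_size=3):
        super().__init__()
        self.conv = nn.Conv1d(dim, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, h):                                   # h: (batch, seq, dim)
        scores = self.conv(h.transpose(1, 2)).transpose(1, 2)   # (batch, seq, 1)
        alpha = F.softmax(scores, dim=1)
        return (alpha * h).sum(dim=1)                       # (batch, dim)


class HierarchicalMultiAttention(nn.Module):
    def __init__(self, vocab_size, emb_dim=100, hid_dim=50, num_classes=5):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        self.word_gru = nn.GRU(emb_dim, hid_dim, batch_first=True, bidirectional=True)
        self.word_attn = SoftAttention(2 * hid_dim)         # word-sentence level
        self.sent_gru = nn.GRU(2 * hid_dim, hid_dim, batch_first=True, bidirectional=True)
        self.sent_attn = CNNAttention(2 * hid_dim)          # sentence-document level
        self.classifier = nn.Linear(2 * hid_dim, num_classes)

    def forward(self, docs):                 # docs: (batch, n_sents, n_words) token ids
        b, n_sents, n_words = docs.shape
        words = self.embed(docs.view(b * n_sents, n_words))
        h_w, _ = self.word_gru(words)                        # encode words per sentence
        sent_vecs = self.word_attn(h_w).view(b, n_sents, -1) # pool words -> sentence vectors
        h_s, _ = self.sent_gru(sent_vecs)                    # encode sentence sequence
        doc_vec = self.sent_attn(h_s)                        # pool sentences -> document vector
        return self.classifier(doc_vec)                      # class logits


# Toy usage: 2 documents, 3 sentences each, 6 tokens per sentence.
model = HierarchicalMultiAttention(vocab_size=1000)
logits = model(torch.randint(1, 1000, (2, 3, 6)))
print(logits.shape)                                          # torch.Size([2, 5])
```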
Pages: 1639 / 1647
Page count: 8
Related papers
50 items in total
  • [41] Patent Citation Dynamics Modeling via Multi-Attention Recurrent Networks
    Ji, Taoran
    Chen, Zhiqian
    Self, Nathan
    Fu, Kaiqun
    Lu, Chang-Tien
    Ramakrishnan, Naren
    PROCEEDINGS OF THE TWENTY-EIGHTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2019, : 2621 - 2627
  • [42] Multi-Attention Segmentation Networks Combined with the Sobel Operator for Medical Images
    Lu, Fangfang
    Tang, Chi
    Liu, Tianxiang
    Zhang, Zhihao
    Li, Leida
    SENSORS, 2023, 23 (05)
  • [43] Multiscale transformers and multi-attention mechanism networks for pathological nuclei segmentation
    Yongzhao Du
    Xin Chen
    Yuqing Fu
    Scientific Reports, 15 (1)
  • [44] Hierarchical Attentional Hybrid Neural Networks for Document Classification
    Abreu, Jader
    Fred, Luis
    Macedo, David
    Zanchettin, Cleber
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2019: WORKSHOP AND SPECIAL SESSIONS, 2019, 11731 : 396 - 402
  • [45] REFORMIST: Hierarchical Attention Networks for Multi-Domain Sentiment Classification with Active Learning
    Katsarou, Katerina
    Douss, Nabil
    Stefanidis, Kostas
    38TH ANNUAL ACM SYMPOSIUM ON APPLIED COMPUTING, SAC 2023, 2023, : 919 - 928
  • [46] Double-Branch Multi-Attention Mechanism Network for Hyperspectral Image Classification
    Ma, Wenping
    Yang, Qifan
    Wu, Yue
    Zhao, Wei
    Zhang, Xiangrong
    REMOTE SENSING, 2019, 11 (11)
  • [47] CBMAFM: CNN-BiLSTM Multi-Attention Fusion Mechanism for sentiment classification
    Wankhade, Mayur
    Annavarapu, Chandra Sekhara Rao
    Abraham, Ajith
    MULTIMEDIA TOOLS AND APPLICATIONS, 2023, 83 (17) : 51755 - 51786
  • [48] Multi-attention fusion and weighted class representation for few-shot classification
    赵文仓
    QIN Wenqian
    LI Ming
    High Technology Letters, 2022, 28 (03) : 295 - 306
  • [49] Multi-attention mechanism based on gate recurrent unit for English text classification
    Liu, Haiying
    EAI ENDORSED TRANSACTIONS ON SCALABLE INFORMATION SYSTEMS, 2022, 9 (04):
  • [50] A NOVEL MULTI-ATTENTION DRIVEN SYSTEM FOR MULTI-LABEL REMOTE SENSING IMAGE CLASSIFICATION
    Sumbul, Gencer
    Demir, Begum
    2019 IEEE INTERNATIONAL GEOSCIENCE AND REMOTE SENSING SYMPOSIUM (IGARSS 2019), 2019, : 5726 - 5729