Hierarchical multi-attention networks for document classification

Cited by: 1
Authors
Yingren Huang
Jiaojiao Chen
Shaomin Zheng
Yun Xue
Xiaohui Hu
Affiliations
[1] Guangdong University of Foreign Studies,Laboratory of Language Engineering and Computing
[2] South China Normal University,Guangdong Provincial Key Laboratory of Quantum Engineering and Quantum Materials, School of Physics and Telecommunication Engineering
Keywords
Document classification; Hierarchical network; Bi-GRU; Attention mechanism
DOI
Not available
Abstract
Research on document classification increasingly employs attention-based deep learning algorithms and has achieved impressive results. Owing to the complexity of documents, classical models, as well as single-attention mechanisms, fail to meet the demand for high-accuracy classification. This paper proposes a method that classifies documents via hierarchical multi-attention networks, which describe a document at both the word-sentence level and the sentence-document level. Further, different attention strategies are applied at the two levels, enabling more accurate assignment of attention weights: a soft attention mechanism is applied at the word-sentence level, and CNN-attention at the sentence-document level. Owing to this design, the proposed method delivers the highest accuracy compared with other state-of-the-art methods. In addition, attention-weight visualizations demonstrate the effectiveness of the attention mechanism in distinguishing the importance of words and sentences.
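The architecture described in the abstract can be sketched as follows. This is a minimal, hedged PyTorch illustration, not the authors' implementation: the hidden sizes, the additive form of the soft attention, and the use of a 1-D convolution to produce sentence-level attention scores (one plausible reading of "CNN-attention") are all assumptions for the sake of a runnable example.

```python
# Sketch (assumed details, not the paper's code): words in each sentence are
# encoded by a Bi-GRU and pooled with soft attention into sentence vectors;
# sentence vectors are encoded by a second Bi-GRU and pooled with a
# convolution-based attention into a document vector for classification.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SoftAttention(nn.Module):
    """Additive (soft) attention pooling over a sequence of hidden states."""
    def __init__(self, dim):
        super().__init__()
        self.proj = nn.Linear(dim, dim)
        self.context = nn.Linear(dim, 1, bias=False)

    def forward(self, h):                          # h: (batch, seq, dim)
        u = torch.tanh(self.proj(h))               # (batch, seq, dim)
        a = F.softmax(self.context(u), dim=1)      # (batch, seq, 1)
        return (a * h).sum(dim=1)                  # (batch, dim)

class HierarchicalMultiAttention(nn.Module):
    def __init__(self, vocab_size, emb_dim=100, hid=50, n_classes=5):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim)
        self.word_gru = nn.GRU(emb_dim, hid, bidirectional=True, batch_first=True)
        self.word_attn = SoftAttention(2 * hid)
        self.sent_gru = nn.GRU(2 * hid, hid, bidirectional=True, batch_first=True)
        # "CNN-attention" modeled here as a 1-D conv that scores each sentence
        self.sent_conv = nn.Conv1d(2 * hid, 1, kernel_size=3, padding=1)
        self.fc = nn.Linear(2 * hid, n_classes)

    def forward(self, docs):                       # docs: (batch, n_sents, n_words)
        b, s, w = docs.shape
        x = self.emb(docs.view(b * s, w))          # (b*s, w, emb_dim)
        h, _ = self.word_gru(x)                    # (b*s, w, 2*hid)
        sents = self.word_attn(h).view(b, s, -1)   # (b, s, 2*hid)
        hs, _ = self.sent_gru(sents)               # (b, s, 2*hid)
        scores = self.sent_conv(hs.transpose(1, 2))        # (b, 1, s)
        a = F.softmax(scores.transpose(1, 2), dim=1)       # (b, s, 1)
        doc = (a * hs).sum(dim=1)                  # (b, 2*hid)
        return self.fc(doc)                        # (b, n_classes)
```

Feeding a batch of tokenized documents shaped `(batch, n_sents, n_words)` yields class logits shaped `(batch, n_classes)`; the two attention weight tensors are the quantities the paper visualizes.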
Pages: 1639 - 1647
Page count: 8
Related papers
50 entries in total
  • [1] Hierarchical multi-attention networks for document classification
    Huang, Yingren
    Chen, Jiaojiao
    Zheng, Shaomin
    Xue, Yun
    Hu, Xiaohui
    INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 2021, 12 (06) : 1639 - 1647
  • [2] Recurrent Networks for Guided Multi-Attention Classification
    Dai, Xin
    Kong, Xiangnan
    Guo, Tian
    Lee, John Boaz
    Liu, Xinyue
    Moore, Constance
    KDD '20: PROCEEDINGS OF THE 26TH ACM SIGKDD INTERNATIONAL CONFERENCE ON KNOWLEDGE DISCOVERY & DATA MINING, 2020, : 412 - 420
  • [3] Hierarchical Hybrid Neural Networks With Multi-Head Attention for Document Classification
    Huang, Weihao
    Chen, Jiaojiao
    Cai, Qianhua
    Liu, Xuejie
    Zhang, Yudong
    Hu, Xiaohui
    INTERNATIONAL JOURNAL OF DATA WAREHOUSING AND MINING, 2022, 18 (01)
  • [4] Topic-aware hierarchical multi-attention network for text classification
    Jiang, Ye
    Wang, Yimin
    INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 2023, 14 (05) : 1863 - 1875
  • [5] Hierarchical Attention Transformer Networks for Long Document Classification
    Hu, Yongli
    Chen, Puman
    Liu, Tengfei
    Gao, Junbin
    Sun, Yanfeng
    Yin, Baocai
    2021 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2021
  • [6] Residual networks with multi-attention mechanism for hyperspectral image classification
    Shao Y.
    Lan J.
    Liang Y.
    Hu J.
    Arabian Journal of Geosciences, 2021, 14 (4)
  • [7] Hierarchical Multi-Attention Transfer for Knowledge Distillation
    Gou, Jianping
    Sun, Liyuan
    Yu, Baosheng
    Wan, Shaohua
    Tao, Dacheng
    ACM TRANSACTIONS ON MULTIMEDIA COMPUTING COMMUNICATIONS AND APPLICATIONS, 2024, 20 (02)
  • [8] Hierarchical Self-Attention Hybrid Sparse Networks for Document Classification
    Huang, Weichun
    Tao, Ziqiang
    Huang, Xiaohui
    Xiong, Liyan
    Yu, Jia
    MATHEMATICAL PROBLEMS IN ENGINEERING, 2021, 2021
  • [9] Targeted sentiment classification with multi-attention network
    Tian X.
    Liu P.
    Zhu Z.
    International Journal of Wireless and Mobile Computing, 2022, 23 (3-4) : 231 - 238