Hierarchical multi-attention networks for document classification

Cited by: 27
Authors
Huang, Yingren [1 ,2 ]
Chen, Jiaojiao [2 ]
Zheng, Shaomin [2 ]
Xue, Yun [2 ]
Hu, Xiaohui [2 ]
Affiliations
[1] Guangdong Univ Foreign Studies, Lab Language Engn & Comp, Guangzhou, Guangdong, Peoples R China
[2] South China Normal Univ, Guangdong Prov Key Lab Quantum Engn & Quantum Mat, Sch Phys & Telecommun Engn, Guangzhou 510006, Peoples R China
Keywords
Document classification; Hierarchical network; Bi-GRU; Attention mechanism; SYSTEM;
DOI
10.1007/s13042-020-01260-x
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Research on document classification increasingly employs attention-based deep learning algorithms and has achieved impressive results. Owing to the complexity of documents, classical models, as well as single-attention mechanisms, fail to meet the demand for high-accuracy classification. This paper proposes a method that classifies documents via hierarchical multi-attention networks, which describe a document at both the word-sentence level and the sentence-document level. Further, different attention strategies are applied at different levels, enabling accurate assignment of attention weights. Specifically, a soft attention mechanism is applied at the word-sentence level, while CNN attention is applied at the sentence-document level. Owing to the distinctiveness of the model, the proposed method delivers the highest accuracy compared with other state-of-the-art methods. In addition, attention-weight visualizations demonstrate the effectiveness of the attention mechanism in distinguishing word and sentence importance.
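The architecture described in the abstract can be sketched as follows. This is a minimal PyTorch sketch, not the authors' implementation: it assumes a Bi-GRU word encoder whose states are pooled by additive soft attention into sentence vectors, and a Bi-GRU sentence encoder whose states are pooled by a 1-D convolutional ("CNN") attention into a document vector. All module names, dimensions, and the kernel size are illustrative assumptions.

```python
# Hypothetical sketch of a hierarchical multi-attention classifier
# (soft attention at the word-sentence level, CNN attention at the
# sentence-document level); details are assumptions, not the paper's code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SoftAttention(nn.Module):
    """Additive soft attention: score each timestep, return the weighted sum."""
    def __init__(self, dim):
        super().__init__()
        self.proj = nn.Linear(dim, dim)
        self.context = nn.Linear(dim, 1, bias=False)

    def forward(self, h):                                   # h: (batch, steps, dim)
        scores = self.context(torch.tanh(self.proj(h)))     # (batch, steps, 1)
        weights = F.softmax(scores, dim=1)
        return (weights * h).sum(dim=1), weights.squeeze(-1)


class CNNAttention(nn.Module):
    """Convolutional attention: a 1-D conv over the sequence produces scores."""
    def __init__(self, dim, kernel_size=3):
        super().__init__()
        self.conv = nn.Conv1d(dim, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, h):                                   # h: (batch, steps, dim)
        scores = self.conv(h.transpose(1, 2)).transpose(1, 2)  # (batch, steps, 1)
        weights = F.softmax(scores, dim=1)
        return (weights * h).sum(dim=1), weights.squeeze(-1)


class HierarchicalMultiAttention(nn.Module):
    def __init__(self, vocab_size, embed_dim, hidden_dim, num_classes):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.word_gru = nn.GRU(embed_dim, hidden_dim,
                               bidirectional=True, batch_first=True)
        self.word_attn = SoftAttention(2 * hidden_dim)      # word-sentence level
        self.sent_gru = nn.GRU(2 * hidden_dim, hidden_dim,
                               bidirectional=True, batch_first=True)
        self.sent_attn = CNNAttention(2 * hidden_dim)       # sentence-document level
        self.fc = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, docs):                   # docs: (batch, sents, words) token ids
        b, s, w = docs.shape
        words = self.embed(docs.view(b * s, w))             # (b*s, w, embed_dim)
        h_w, _ = self.word_gru(words)                       # (b*s, w, 2*hidden)
        sent_vecs, _ = self.word_attn(h_w)                  # (b*s, 2*hidden)
        h_s, _ = self.sent_gru(sent_vecs.view(b, s, -1))    # (b, s, 2*hidden)
        doc_vec, _ = self.sent_attn(h_s)                    # (b, 2*hidden)
        return self.fc(doc_vec)


model = HierarchicalMultiAttention(vocab_size=100, embed_dim=16,
                                   hidden_dim=8, num_classes=4)
logits = model(torch.randint(0, 100, (2, 3, 5)))  # 2 docs, 3 sentences, 5 words
print(logits.shape)  # torch.Size([2, 4])
```

The returned per-timestep weights from both attention modules are what an attention-weight visualization (as mentioned in the abstract) would plot over words and sentences.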
Pages: 1639-1647
Page count: 9
Related Papers
50 records in total
  • [1] Hierarchical multi-attention networks for document classification
    Yingren Huang
    Jiaojiao Chen
    Shaomin Zheng
    Yun Xue
    Xiaohui Hu
    International Journal of Machine Learning and Cybernetics, 2021, 12 : 1639 - 1647
  • [2] Recurrent Networks for Guided Multi-Attention Classification
    Dai, Xin
    Kong, Xiangnan
    Guo, Tian
    Lee, John Boaz
    Liu, Xinyue
    Moore, Constance
    KDD '20: PROCEEDINGS OF THE 26TH ACM SIGKDD INTERNATIONAL CONFERENCE ON KNOWLEDGE DISCOVERY & DATA MINING, 2020, : 412 - 420
  • [3] Hierarchical Hybrid Neural Networks With Multi-Head Attention for Document Classification
    Huang, Weihao
    Chen, Jiaojiao
    Cai, Qianhua
    Liu, Xuejie
    Zhang, Yudong
    Hu, Xiaohui
    INTERNATIONAL JOURNAL OF DATA WAREHOUSING AND MINING, 2022, 18 (01)
  • [4] Topic-aware hierarchical multi-attention network for text classification
    Ye Jiang
    Yimin Wang
    International Journal of Machine Learning and Cybernetics, 2023, 14 : 1863 - 1875
  • [5] Hierarchical Attention Transformer Networks for Long Document Classification
    Hu, Yongli
    Chen, Puman
    Liu, Tengfei
    Gao, Junbin
    Sun, Yanfeng
    Yin, Baocai
    2021 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2021,
  • [6] Topic-aware hierarchical multi-attention network for text classification
    Jiang, Ye
    Wang, Yimin
    INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 2023, 14 (05) : 1863 - 1875
  • [7] Residual networks with multi-attention mechanism for hyperspectral image classification
    Shao Y.
    Lan J.
    Liang Y.
    Hu J.
    Arabian Journal of Geosciences, 2021, 14 (4)
  • [8] Hierarchical Multi-Attention Transfer for Knowledge Distillation
    Gou, Jianping
    Sun, Liyuan
    Yu, Baosheng
    Wan, Shaohua
    Tao, Dacheng
    ACM TRANSACTIONS ON MULTIMEDIA COMPUTING COMMUNICATIONS AND APPLICATIONS, 2024, 20 (02)
  • [9] Hierarchical Self-Attention Hybrid Sparse Networks for Document Classification
    Huang, Weichun
    Tao, Ziqiang
    Huang, Xiaohui
    Xiong, Liyan
    Yu, Jia
    MATHEMATICAL PROBLEMS IN ENGINEERING, 2021, 2021
  • [10] Targeted sentiment classification with multi-attention network
    Tian X.
    Liu P.
    Zhu Z.
    International Journal of Wireless and Mobile Computing, 2022, 23 (3-4) : 231 - 238