GACaps-HTC: graph attention capsule network for hierarchical text classification

Cited by: 0
Authors
Jinhyun Bang
Jonghun Park
Jonghyuk Park
Affiliations
[1] Seoul National University, Department of Industrial Engineering & Institute for Industrial Systems Innovation
[2] Kookmin University, Department of AI, Big Data & Management
Source
Applied Intelligence | 2023, Vol. 53
Keywords
Hierarchical text classification; Graph neural network; Capsule network; Attention mechanism; Natural language processing;
DOI: not available
Abstract
Hierarchical text classification has been receiving increasing attention due to its vast range of applications in real-world natural language processing tasks. While previous approaches have focused on effectively exploiting the label hierarchy for classification or capturing latent label relationships, few studies have integrated these concepts. In this work, we propose a graph attention capsule network for hierarchical text classification (GACaps-HTC), designed to capture both the explicit hierarchy and implicit relationships of labels. A graph attention network is employed to incorporate the information on the label hierarchy into a textual representation, whereas a capsule network infers classification probabilities by understanding the latent label relationships via iterative updates. The proposed approach is optimized using a loss term designed to address the innate label imbalance issue of the task. Experiments were conducted on two widely used text classification datasets, the WOS-46985 dataset and the RCV1 dataset. The results reveal that the proposed approach achieved a 0.6% gain and a 2.0% gain in micro-F1 and macro-F1 scores, respectively, on the WOS-46985 dataset and a 0.3% gain and a 2.2% gain in micro-F1 and macro-F1 scores, respectively, on the RCV1 dataset compared to the previous state-of-the-art approaches. Further ablation studies show that each component in GACaps-HTC played a part in enhancing the classification performance.
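The abstract states that the capsule network infers classification probabilities through iterative updates that capture latent label relationships. A minimal sketch of what such iterative capsule routing typically looks like is given below, assuming the standard routing-by-agreement scheme (Sabour et al.); the function names `squash` and `dynamic_routing`, the tensor shapes, and the iteration count are illustrative assumptions, not the paper's actual implementation, whose routing details may differ.

```python
import numpy as np

def squash(v, axis=-1, eps=1e-8):
    # Non-linear "squash": shrinks a vector's length into [0, 1) while
    # preserving its direction, so length can be read as a probability.
    sq = np.sum(v ** 2, axis=axis, keepdims=True)
    return (sq / (1.0 + sq)) * v / np.sqrt(sq + eps)

def dynamic_routing(u_hat, n_iters=3):
    # u_hat: prediction vectors from lower-level capsules to label capsules,
    # shape (n_in, n_labels, d_out). Returns label capsule outputs
    # of shape (n_labels, d_out) after iterative agreement updates.
    n_in, n_labels, _ = u_hat.shape
    b = np.zeros((n_in, n_labels))                   # routing logits
    for _ in range(n_iters):
        # Coupling coefficients: softmax of logits over the label capsules.
        e = np.exp(b - b.max(axis=1, keepdims=True))
        c = e / e.sum(axis=1, keepdims=True)
        s = np.einsum("ij,ijd->jd", c, u_hat)        # weighted sum of predictions
        v = squash(s)                                # label capsule outputs
        b += np.einsum("ijd,jd->ij", u_hat, v)       # reward agreement
    return v

# The length of each label capsule serves as that label's score.
rng = np.random.default_rng(0)
u_hat = rng.normal(size=(10, 4, 8))                  # 10 lower capsules, 4 labels, dim 8
probs = np.linalg.norm(dynamic_routing(u_hat), axis=-1)
```

The iterative `b` update is what lets capsules "agree" on labels, which is how latent label relationships can emerge without being hard-coded in the hierarchy.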
Pages: 20577–20594 (17 pages)