Gate-Attention and Dual-End Enhancement Mechanism for Multi-Label Text Classification

Cited: 0
Authors
Cheng, Jieren [1 ,2 ]
Chen, Xiaolong [1 ]
Xu, Wenghang [3 ]
Hua, Shuai [3 ]
Tang, Zhu [1 ]
Sheng, Victor S. [4 ]
Affiliations
[1] Hainan Univ, Sch Comp Sci & Technol, Haikou 570228, Peoples R China
[2] Hainan Univ, Hainan Blockchain Technol Engn Res Ctr, Haikou 570228, Peoples R China
[3] Hainan Univ, Sch Cyberspace Secur, Haikou 570228, Peoples R China
[4] Texas Tech Univ, Dept Comp Sci, Lubbock, TX 79409 USA
Source
CMC-COMPUTERS MATERIALS & CONTINUA | 2023, Vol. 77, No. 2
Funding
National Natural Science Foundation of China
Keywords
Multi-label text classification; feature extraction; label distribution information; sequence generation;
DOI
10.32604/cmc.2023.042980
CLC Number
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
In the realm of Multi-Label Text Classification (MLTC), the dual challenges of extracting rich semantic features from text and discerning inter-label relationships have spurred innovative approaches. Many studies in semantic feature extraction have turned to external knowledge to augment the model's grasp of textual content, often overlooking intrinsic textual cues such as label statistical features. Yet these endogenous insights naturally align with the classification task. In this paper, to exploit this intrinsic knowledge, we introduce a novel Gate-Attention mechanism. This mechanism integrates statistical features drawn from the text itself into its semantic representation, enhancing the model's capacity to understand and represent the data. Additionally, to address the intricate task of mining label correlations, we propose a Dual-end enhancement mechanism. This mechanism mitigates the information loss and erroneous transmission inherent in traditional Long Short-Term Memory (LSTM) propagation. We conducted extensive experiments on the AAPD and RCV1-2 datasets, which confirm the efficacy of both the Gate-Attention mechanism and the Dual-end enhancement mechanism. Our final model consistently outperforms the baseline model, attesting to its robustness. These findings underscore the importance of accounting for not just external knowledge but also the intrinsic features of textual data when crafting effective MLTC models.
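The gated fusion of semantic and label-statistical features that the abstract describes might be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: the function name `gate_attention_fuse`, the projection matrices, and the convex-combination form of the fusion are all assumptions about how such a Gate-Attention step could look.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gate_attention_fuse(semantic, statistical, W_g, W_s):
    """Fuse a semantic text representation with endogenous label-statistics
    features via a learned gate (hypothetical sketch).

    semantic:    (d,) semantic feature vector of the text
    statistical: (k,) label statistical features extracted from the text
    W_g:         (d, d + k) gate projection (assumed learned)
    W_s:         (d, k) projection of statistics into the semantic space
    """
    stat_proj = W_s @ statistical  # map statistics into the semantic space
    gate = sigmoid(W_g @ np.concatenate([semantic, statistical]))
    # The gate decides, per dimension, how much of each source to keep.
    return gate * semantic + (1.0 - gate) * stat_proj

d, k = 8, 3
sem = rng.standard_normal(d)
stat = rng.standard_normal(k)
W_g = rng.standard_normal((d, d + k))
W_s = rng.standard_normal((d, k))

fused = gate_attention_fuse(sem, stat, W_g, W_s)
print(fused.shape)  # (8,)
```

Because the gate output lies in (0, 1), each fused dimension is a convex combination of the semantic feature and the projected statistical feature, so the statistical signal can only shift, never replace, the semantic representation.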
Pages: 1779-1793
Page count: 15