Enhancing Label Correlation Feedback in Multi-Label Text Classification via Multi-Task Learning

Cited by: 0
Authors
Zhang, Ximing [1]
Zhang, Qian-Wen [2]
Yan, Zhao [2]
Liu, Ruifang [1]
Cao, Yunbo [2]
Affiliations
[1] Beijing Univ Posts & Telecommun, Beijing 100876, Peoples R China
[2] Tencent Cloud Xiaowei, Beijing 100080, Peoples R China
Keywords
DOI
Not available
CLC number
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
In multi-label text classification (MLTC), each document is associated with a set of correlated labels. To capture label correlations, previous classifier-chain and sequence-to-sequence models transform MLTC into a sequence prediction task. However, they tend to suffer from label order dependency, label combination over-fitting, and error propagation. To address these problems, we introduce a novel multi-task learning approach that enhances label correlation feedback. We first utilize a joint embedding (JE) mechanism to obtain text and label representations simultaneously. For the MLTC task, a document-label cross attention (CA) mechanism is adopted to generate a more discriminative document representation. Furthermore, we propose two auxiliary label co-occurrence prediction tasks to enhance label correlation learning: 1) Pairwise Label Co-occurrence Prediction (PLCP) and 2) Conditional Label Co-occurrence Prediction (CLCP). Experimental results on the AAPD and RCV1-V2 datasets show that our method outperforms competitive baselines by a large margin. We analyze low-frequency label performance, label dependency, label combination diversity and coverage speed to show the effectiveness of our proposed method on label correlation learning. Our code is available at https://github.com/EiraZhang/LACO.
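The abstract only sketches the architecture at a high level. Below is a minimal PyTorch sketch, under assumed shapes and task formulations, of how a joint text/label encoder, document-label cross attention, and the two auxiliary co-occurrence heads (PLCP, CLCP) could be wired together. All module names, dimensions, pooling choices, and the way label pairs are sampled are illustrative assumptions rather than the authors' implementation; their actual code is at https://github.com/EiraZhang/LACO.

```python
# Illustrative sketch only: joint text/label encoding, document-label cross
# attention, and PLCP/CLCP auxiliary heads as described in the abstract.
import torch
import torch.nn as nn


class LabelCorrelationMTL(nn.Module):
    def __init__(self, vocab_size: int, num_labels: int, dim: int = 256, heads: int = 4):
        super().__init__()
        self.token_emb = nn.Embedding(vocab_size, dim)       # document tokens
        self.label_emb = nn.Embedding(num_labels, dim)        # label "tokens"
        layer = nn.TransformerEncoderLayer(dim, heads, dim * 4, batch_first=True)
        self.joint_encoder = nn.TransformerEncoder(layer, num_layers=2)
        # Document-label cross attention: label states attend over token states.
        self.cross_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.mltc_head = nn.Linear(dim, 1)                    # per-label relevance score
        self.plcp_head = nn.Linear(2 * dim, 1)                # does a label pair co-occur?
        self.clcp_head = nn.Linear(2 * dim, num_labels)       # labels co-occurring with a given one

    def forward(self, token_ids, label_pairs, cond_labels):
        B, L = token_ids.shape
        num_labels = self.label_emb.num_embeddings
        # Joint embedding: encode [document tokens ; all label tokens] as one sequence.
        lbl_ids = torch.arange(num_labels, device=token_ids.device).expand(B, -1)
        seq = torch.cat([self.token_emb(token_ids), self.label_emb(lbl_ids)], dim=1)
        hidden = self.joint_encoder(seq)
        doc_h, lbl_h = hidden[:, :L], hidden[:, L:]
        # Cross attention yields a label-specific view of the document.
        lbl_doc, _ = self.cross_attn(lbl_h, doc_h, doc_h)
        mltc_logits = self.mltc_head(lbl_doc).squeeze(-1)                      # (B, num_labels)
        # PLCP: binary prediction for a sampled label pair (i, j).
        pair = lbl_doc.gather(1, label_pairs.unsqueeze(-1).expand(-1, -1, lbl_doc.size(-1)))
        plcp_logits = self.plcp_head(pair.flatten(1)).squeeze(-1)              # (B,)
        # CLCP: given one observed label, score all labels as co-occurring or not.
        cond = lbl_doc.gather(1, cond_labels.view(B, 1, 1).expand(-1, -1, lbl_doc.size(-1)))
        doc_pooled = doc_h.mean(dim=1, keepdim=True)
        clcp_logits = self.clcp_head(torch.cat([cond, doc_pooled], dim=-1)).squeeze(1)
        return mltc_logits, plcp_logits, clcp_logits


def multitask_loss(outputs, targets, weights=(1.0, 0.5, 0.5)):
    # Weighted sum of the main MLTC loss and the two auxiliary losses.
    mltc, plcp, clcp = outputs
    y_mltc, y_plcp, y_clcp = targets
    bce = nn.functional.binary_cross_entropy_with_logits
    return (weights[0] * bce(mltc, y_mltc)
            + weights[1] * bce(plcp, y_plcp)
            + weights[2] * bce(clcp, y_clcp))
```

In this reading, y_plcp would mark whether a sampled label pair both appear in the document's gold label set, and y_clcp would mark which other labels co-occur given one observed gold label; the weighted sum is a standard multi-task objective, and the weights here are arbitrary placeholders.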
Pages: 1190-1200
Page count: 11
Related papers
(50 records in total)
  • [1] Liu, Huiting; Chen, Geng; Li, Peipei; Zhao, Peng; Wu, Xindong. Multi-label text classification via joint learning from label embedding and label correlation. Neurocomputing, 2021, 460: 385-398.
  • [2] Lin, Nankai; Fu, Sihui; Lin, Xiaotian; Wang, Lianxi. Multi-label emotion classification based on adversarial multi-task learning. Information Processing & Management, 2022, 59(06).
  • [3] Shen, Yi; Fan, Jian-ping. Multi-task multi-label multiple instance learning. Journal of Zhejiang University - Science C (Computers & Electronics), 2010, 11(11): 860-871.
  • [5] Motamedvaziri, D.; Saligrama, V.; Castanon, D. A combined approach to multi-label multi-task learning. 2012 IEEE Statistical Signal Processing Workshop (SSP), 2012: 616-619.
  • [6] Zhang, Honglun; Xiao, Liqiang; Chen, Wenqing; Wang, Yongkun; Jin, Yaohui. Multi-task label embedding for text classification. 2018 Conference on Empirical Methods in Natural Language Processing (EMNLP 2018), 2018: 4545-4553.
  • [7] Lu, Junyu; Zhang, Hao; Shen, Zhexu; Shi, Kaiyuan; Yang, Liang; Xu, Bo; Zhang, Shaowu; Lin, Hongfei. Multi-task hierarchical cross-attention network for multi-label text classification. Natural Language Processing and Chinese Computing (NLPCC 2022), Part II, 2022, 13552: 156-167.
  • [8] Yang, Shuo; Gao, Shu. Multi-label text classification based on contrastive and correlation learning. Proceedings of the 2024 3rd International Conference on Cyber Security, Artificial Intelligence and Digital Economy (CSAIDE 2024), 2024: 325-330.
  • [9] He, Zhifen; Yang, Ming; Liu, Huidong. Multi-task joint feature selection for multi-label classification. Chinese Journal of Electronics, 2015, 24(02): 281-287.