Integrating information by Kullback-Leibler constraint for text classification

Cited by: 1
Authors
Yin, Shu [1 ,2 ]
Zhu, Peican [2 ]
Wu, Xinyu [3 ]
Huang, Jiajin [4 ]
Li, Xianghua [2 ]
Wang, Zhen [2 ]
Gao, Chao [2 ,3 ]
Affiliations
[1] Northwestern Polytech Univ, Sch Comp Sci, Xian 710072, Shaanxi, Peoples R China
[2] Northwestern Polytech Univ, Sch Artificial Intelligence Opt & Elect iOPEN, Xian 710072, Shaanxi, Peoples R China
[3] Southwest Univ, Coll Comp & Informat Sci, Chongqing 400715, Peoples R China
[4] Beijing Univ Technol, Fac Informat Technol, Beijing 100083, Peoples R China
Source
NEURAL COMPUTING & APPLICATIONS | 2023, Vol. 35, Issue 24
Funding
National Natural Science Foundation of China
Keywords
Text classification; Graph neural network; Kullback-Leibler divergence; Constraint; Convolutional neural networks
DOI
10.1007/s00521-023-08602-0
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Text classification underpins a variety of downstream text-related tasks, such as fake news detection, sentiment analysis, and question answering. In recent years, graph-based methods have achieved excellent results on text classification tasks. Instead of treating a text as a sequence, these methods represent it as a co-occurrence set of words; classification is then performed by aggregating information from neighboring nodes with a graph neural network. However, existing corpus-level graph models struggle to incorporate local semantic information and to classify newly arriving texts. To address these issues, we propose a Global-Local Text Classification (GLTC) model that uses Kullback-Leibler (KL) constraints to realize inductive learning for text classification. First, a global structural feature extractor and a local semantic feature extractor are designed to comprehensively capture the structural and semantic information of a text. Then, the KL divergence is introduced as a regularization term in the loss, which ensures that the global structural feature extractor constrains the learning of the local semantic feature extractor, thereby achieving inductive learning. Comprehensive experiments on benchmark datasets show that GLTC outperforms baseline methods in terms of accuracy.
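The abstract describes a KL divergence used as a regularization term that lets the global extractor constrain the local extractor. A minimal sketch of such a KL-regularized loss is shown below; the function names, the weighting factor `lam`, and the exact way the two extractors' predictions are combined are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax over the last axis.
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kl_divergence(p, q, eps=1e-12):
    # KL(p || q) = sum_i p_i * log(p_i / q_i), clipped for stability.
    p = np.clip(p, eps, 1.0)
    q = np.clip(q, eps, 1.0)
    return np.sum(p * np.log(p / q), axis=-1)

def cross_entropy(p, labels, eps=1e-12):
    # Negative log-likelihood of the true class for each sample.
    return -np.log(np.clip(p[np.arange(len(labels)), labels], eps, 1.0))

def kl_constrained_loss(global_logits, local_logits, labels, lam=0.1):
    """Classification loss plus a KL regularizer that pulls the local
    extractor's predictions toward the global extractor's predictions.
    (Hypothetical sketch of the GLTC-style objective, not the paper's code.)"""
    p_global = softmax(global_logits)
    p_local = softmax(local_logits)
    ce = cross_entropy(p_local, labels).mean()
    kl = kl_divergence(p_global, p_local).mean()
    return ce + lam * kl
```

When the local predictions match the global ones, the KL term vanishes and only the classification loss remains; as they diverge, the regularizer penalizes the local extractor, which is how the global model "constrains" the local one.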
Pages: 17521-17535 (15 pages)