Graph Receptive Transformer Encoder for Text Classification

Cited by: 0
Authors
Aras, Arda Can [1 ,2 ]
Alikaşifoğlu, Tuna [1 ,2 ]
Koç, Aykut [1 ,2 ]
Affiliations
[1] Department of Electrical and Electronics Engineering, Bilkent University, Ankara 06800, Turkey
[2] UMRAM, Bilkent University, Ankara 06800, Turkey
Keywords
Classification (of information) - Graph neural networks - Job analysis - Network coding - Text processing
DOI
Not available
Abstract
By employing attention mechanisms, transformers have made great improvements in nearly all NLP tasks, including text classification. However, the context of the transformer’s attention mechanism is limited to single sequences, and their fine-tuning stage can utilize only inductive learning. Focusing on broader contexts by representing texts as graphs, previous works have generalized transformer models to graph domains to employ attention mechanisms beyond single sequences. However, these approaches either require exhaustive pre-training stages, learn only transductively, or can learn inductively without utilizing pre-trained models. To address these problems simultaneously, we propose the Graph Receptive Transformer Encoder (GRTE), which combines graph neural networks (GNNs) with large-scale pre-trained models for text classification in both inductive and transductive fashions. By constructing heterogeneous and homogeneous graphs over given corpora and not requiring a pre-training stage, GRTE can utilize information from both large-scale pre-trained models and graph-structured relations. Our proposed method retrieves global and contextual information in documents and generates word embeddings as a by-product of inductive inference. We compared the proposed GRTE with a wide range of baseline models through comprehensive experiments. Compared to the state-of-the-art, we demonstrated that GRTE improves model performances and offers computational savings up to ~100×. © 2024 IEEE.
Pages: 347 - 359
Related Papers
50 items in total
  • [1] Graph Receptive Transformer Encoder for Text Classification
    Aras, Arda Can
    Alikasifoglu, Tuna
    Koc, Aykut
    IEEE TRANSACTIONS ON SIGNAL AND INFORMATION PROCESSING OVER NETWORKS, 2024, 10 : 347 - 359
  • [2] Text Graph Transformer for Document Classification
    Zhang, Haopeng
    Zhang, Jiawei
    PROCEEDINGS OF THE 2020 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP), 2020, : 8322 - 8327
  • [3] Transformer and Graph Convolutional Network for Text Classification
    Boting Liu
    Weili Guan
    Changjin Yang
    Zhijie Fang
    Zhiheng Lu
    International Journal of Computational Intelligence Systems, 16
  • [4] Transformer and Graph Convolutional Network for Text Classification
    Liu, Boting
    Guan, Weili
    Yang, Changjin
    Fang, Zhijie
    Lu, Zhiheng
    INTERNATIONAL JOURNAL OF COMPUTATIONAL INTELLIGENCE SYSTEMS, 2023, 16 (01)
  • [5] Relation classification via knowledge graph enhanced transformer encoder
    Huang, Wenti
    Mao, Yiyu
    Yang, Zhan
    Zhu, Lei
    Long, Jun
    KNOWLEDGE-BASED SYSTEMS, 2020, 206
  • [6] Inductive Topic Variational Graph Auto-Encoder for Text Classification
    Xie, Qianqian
    Huang, Jimin
    Du, Pan
    Peng, Min
    Nie, Jian-Yun
    2021 CONFERENCE OF THE NORTH AMERICAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: HUMAN LANGUAGE TECHNOLOGIES (NAACL-HLT 2021), 2021, : 4218 - 4227
  • [7] Heterogeneous Graph Transformer for Meta-structure Learning with Application in Text Classification
    Wang, Shuhai
    Liu, Xin
    Pan, Xiao
    Xu, Hanjie
    Liu, Mingrui
    ACM TRANSACTIONS ON THE WEB, 2023, 17 (03)
  • [8] Encoder embedding for general graph and node classification
    Shen, Cencheng
    APPLIED NETWORK SCIENCE, 2024, 9 (01)
  • [9] Multi-Encoder Transformer for Korean Abstractive Text Summarization
    Shin, Youhyun
    IEEE ACCESS, 2023, 11 : 48768 - 48782
  • [10] DUAL TRANSFORMER ENCODER MODEL FOR MEDICAL IMAGE CLASSIFICATION
    Yan, Fangyuan
    Yan, Bin
    Pei, Mingtao
    2023 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING, ICIP, 2023, : 690 - 694