SAKE: A Graph-Based Keyphrase Extraction Method Using Self-attention

Cited: 0
Authors:
Zhu, Ping [1 ]
Gong, Chuanyang [1 ]
Wei, Zhihua [1 ]
Affiliations:
[1] Tongji Univ, Coll Elect & Informat Engn, Shanghai, Peoples R China
Keywords:
Keyphrase extraction; Self-attention; Pre-trained model
DOI:
10.1007/978-3-031-08530-7_28
CLC classification:
TP18 [Artificial Intelligence Theory]
Discipline codes:
081104; 0812; 0835; 1405
Abstract:
Keyphrase extraction is a text analysis technique that automatically extracts the most frequent and most important words and phrases from a text. It helps summarize the content of documents and identify the main topics discussed. Most existing techniques are domain-specific: they require application-domain knowledge and rely on higher-order statistical methods. Supervised keyphrase extraction requires a large amount of labeled training data and generalizes poorly outside the training-data domain, while unsupervised systems tend to have lower accuracy and often do not generalize well either. This paper proposes an unsupervised graph-based keyphrase extraction model that incorporates the words' self-attention scores. Specifically, the proposed approach identifies the importance of each source word based on a word graph built from the self-attention layer of a Transformer, and further introduces a new mechanism to capture the relationships between words in different sentences. The experimental results show that the proposed approach achieves remarkable improvements over state-of-the-art models.
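The core idea described in the abstract (scoring words via a graph whose edges come from Transformer self-attention weights) can be illustrated with a minimal sketch. This is not the authors' implementation: the function name, the PageRank-style ranking, and the toy attention matrix below are all hypothetical assumptions, standing in for attention weights that would normally be extracted from a pre-trained model.

```python
import numpy as np

def rank_words_by_attention(words, attn, damping=0.85, iters=50):
    """Score words with a PageRank-style power iteration over a graph
    whose edge weights are symmetrised self-attention scores.

    words : list of n tokens
    attn  : (n, n) self-attention matrix (e.g. averaged over heads)
    """
    n = len(words)
    # Symmetrise the attention matrix and zero the diagonal so a word
    # cannot vote for itself.
    w = (attn + attn.T) / 2.0
    np.fill_diagonal(w, 0.0)
    # Column-normalise to obtain a column-stochastic transition matrix.
    col_sums = w.sum(axis=0)
    col_sums[col_sums == 0] = 1.0
    m = w / col_sums
    # Standard damped power iteration.
    scores = np.full(n, 1.0 / n)
    for _ in range(iters):
        scores = (1 - damping) / n + damping * (m @ scores)
    return dict(zip(words, scores))

# Toy example with a hypothetical 4x4 attention matrix; in practice the
# matrix would come from a Transformer's self-attention layer.
words = ["graph", "keyphrase", "extraction", "the"]
attn = np.array([
    [0.1, 0.4, 0.3, 0.2],
    [0.5, 0.1, 0.3, 0.1],
    [0.3, 0.4, 0.1, 0.2],
    [0.2, 0.2, 0.2, 0.4],
])
scores = rank_words_by_attention(words, attn)
ranked = sorted(scores, key=scores.get, reverse=True)
```

Candidate keyphrases would then be formed from the top-ranked words; the cross-sentence mechanism the paper mentions would further adjust the edge weights, which this sketch omits.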
Pages: 339-350 (12 pages)
Related Papers (50 total):
  • [41] A Graph-Based Relation Extraction Method for Question Answering System
    Veena, G.
    Gupta, Deepa
    Athulya, S.
    Shaji, Salma
    2017 INTERNATIONAL CONFERENCE ON ADVANCES IN COMPUTING, COMMUNICATIONS AND INFORMATICS (ICACCI), 2017, : 944 - 949
  • [42] Tailoring Self-Attention for Graph via Rooted Subtrees
    Huang, Siyuan
    Song, Yunchong
    Zhou, Jiayue
    Lin, Zhouhan
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023,
  • [43] On the Global Self-attention Mechanism for Graph Convolutional Networks
    Wang, Chen
    Deng, Chengyuan
    2020 25TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR), 2021, : 8531 - 8538
  • [44] Graph neural network with self-attention for material discovery
    Chen, Xuesi
    Jiang, Hantong
    Lin, Xuanjie
    Ren, Yongsheng
    Wu, Congzhong
    Zhan, Shu
    Ma, Wenhui
    MOLECULAR PHYSICS, 2023, 121 (04)
  • [45] SELF-ATTENTION EQUIPPED GRAPH CONVOLUTIONS FOR DISEASE PREDICTION
    Kazi, Anees
    Krishna, S. Arvind
    Shekarforoush, Shayan
    Kortuem, Karsten
    Albarqouni, Shadi
    Navab, Nassir
    2019 IEEE 16TH INTERNATIONAL SYMPOSIUM ON BIOMEDICAL IMAGING (ISBI 2019), 2019, : 1896 - 1899
  • [46] Conversational Emotion Recognition Using Self-Attention Mechanisms and Graph Neural Networks
    Lian, Zheng
    Tao, Jianhua
    Liu, Bin
    Huang, Jian
    Yang, Zhanlei
    Li, Rongjun
    INTERSPEECH 2020, 2020, : 2347 - 2351
  • [47] Crowd counting method based on the self-attention residual network
    Liu, Yan-Bo
    Jia, Rui-Sheng
    Liu, Qing-Ming
    Zhang, Xing-Li
    Sun, Hong-Mei
    APPLIED INTELLIGENCE, 2021, 51 (01) : 427 - 440
  • [48] CGSNet: Contrastive Graph Self-Attention Network for Session-based Recommendation
    Wang, Fuyun
    Lu, Xuequan
    Lyu, Lei
    KNOWLEDGE-BASED SYSTEMS, 2022, 251
  • [49] Joint extraction of entities and relations based on character graph convolutional network and Multi-Head Self-Attention Mechanism
    Meng, Zhao
    Tian, Shengwei
    Yu, Long
    Lv, Yalong
    JOURNAL OF EXPERIMENTAL & THEORETICAL ARTIFICIAL INTELLIGENCE, 2021, 33 (02) : 349 - 362
  • [50] A graph-based edge attention gate medical image segmentation method
    Hao, Dechen
    Li, Hualing
    IET IMAGE PROCESSING, 2023, 17 (07) : 2142 - 2157