SAKE: A Graph-Based Keyphrase Extraction Method Using Self-attention

Cited by: 0
Authors
Zhu, Ping [1]
Gong, Chuanyang [1]
Wei, Zhihua [1]
Affiliations
[1] Tongji Univ, Coll Elect & Informat Engn, Shanghai, Peoples R China
Keywords
Keyword extraction; Self-attention; Pre-trained model
DOI
10.1007/978-3-031-08530-7_28
CLC Number
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Keyphrase extraction is a text analysis technique that automatically extracts the most frequent and most important words and expressions from a text. It helps summarize the content of a text and identify its main topics. Most existing techniques are domain-specific: they require application-domain knowledge and employ higher-order statistical methods. Supervised keyphrase extraction requires a large amount of labeled training data and generalizes poorly outside the training-data domain, while unsupervised systems suffer from lower accuracy and often fail to generalize well either. This paper proposes an unsupervised graph-based keyphrase extraction model that incorporates words' self-attention scores. Specifically, the proposed approach estimates the importance of each source word from a word graph built by the self-attention layer of a Transformer, and further introduces a new mechanism to capture relationships between words in different sentences. Experimental results show that the proposed approach achieves remarkable improvements over state-of-the-art models.
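The abstract describes the approach only at a high level. As a rough illustration of the core idea, attention weights serving as edge weights in a word graph that is then ranked, the following Python sketch builds a directed token graph from the averaged self-attention of a pre-trained Transformer (bert-base-uncased via Hugging Face transformers) and scores nodes with PageRank from networkx. This is a minimal approximation, not the authors' implementation: the model choice, the use of a single layer, head averaging, and PageRank scoring are all assumptions made for illustration.

# Toy attention-weighted word graph (illustrative only; not the SAKE algorithm).
import torch
import networkx as nx
from transformers import AutoModel, AutoTokenizer

MODEL_NAME = "bert-base-uncased"  # assumption: any encoder exposing attentions works
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModel.from_pretrained(MODEL_NAME, output_attentions=True)
model.eval()

def rank_tokens(text, layer=-1, top_k=10):
    """Rank tokens by PageRank over an attention-weighted directed graph."""
    enc = tokenizer(text, return_tensors="pt", truncation=True)
    with torch.no_grad():
        out = model(**enc)
    # out.attentions is a tuple over layers, each (batch, heads, seq, seq);
    # average the heads of the chosen layer into one seq x seq matrix.
    att = out.attentions[layer][0].mean(dim=0)
    tokens = tokenizer.convert_ids_to_tokens(enc["input_ids"][0].tolist())

    graph = nx.DiGraph()
    for i in range(len(tokens)):
        for j in range(len(tokens)):
            if i != j:
                # Edge i -> j weighted by how strongly token i attends to token j.
                graph.add_edge(i, j, weight=float(att[i, j]))

    scores = nx.pagerank(graph, weight="weight")
    ranked = sorted(scores, key=scores.get, reverse=True)
    # Drop special tokens and subword continuations for readability.
    words = [tokens[i] for i in ranked
             if tokens[i] not in tokenizer.all_special_tokens
             and not tokens[i].startswith("##")]
    return words[:top_k]

print(rank_tokens("Keyphrase extraction automatically identifies the most "
                  "important words and expressions in a document."))

Averaging the heads of a single layer keeps the sketch simple; a fuller implementation along the lines the abstract suggests would aggregate attention across layers, model cross-sentence word relationships explicitly, and merge adjacent high-scoring tokens into candidate phrases.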
Pages: 339-350 (12 pages)