GATES: Using Graph Attention Networks for Entity Summarization

Cited by: 1
|
Authors
Firmansyah, Asep Fajar [1]
Moussallem, Diego [1,2]
Ngomo, Axel-Cyrille Ngonga [1]
Affiliations
[1] Paderborn Univ, Data Sci Res Grp, Paderborn, Germany
[2] Globo, Rio De Janeiro, Brazil
Keywords
Entity Summarization; Graph Attention Network; Knowledge Graph Embeddings; Text Embeddings;
DOI
10.1145/3460210.3493574
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The sheer size of modern knowledge graphs has led to increased attention being paid to the entity summarization task. Given a knowledge graph T and an entity e found therein, solutions to entity summarization select a subset of the triples from T that serve as a concise bound description of e. Presently, the best-performing approaches rely on sequence-to-sequence models to generate entity summaries and use little to none of the structural information of T during the summarization process. We hypothesize that this structural information can be exploited to compute better summaries. To verify our hypothesis, we propose GATES, a new entity summarization approach that combines topological information and knowledge graph embeddings to encode triples. The topological information is encoded by means of a Graph Attention Network. Furthermore, ensemble learning is applied to boost the performance of triple scoring. We evaluate GATES on the DBpedia and LMDB datasets from ESBM (version 1.2), as well as on the FACES datasets. Our results show that GATES outperforms the state-of-the-art approaches on 4 of 6 configuration settings and reaches an F-measure of up to 0.574. Regarding the quality of the resulting summaries, GATES still underperforms the state of the art, obtaining the highest score on only 1 of 6 configuration settings, with an NDCG score of 0.697. An open-source implementation of our approach and the code necessary to rerun our experiments are available at https://github.com/dice-group/GATES.
Pages: 73-80
Page count: 8
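The abstract states that GATES encodes the topological information of the knowledge graph with a Graph Attention Network. As a point of reference, the sketch below implements a minimal single-head GAT layer (per Veličković et al., 2018, the standard formulation) in plain NumPy. It is an illustration of the general mechanism only, not the authors' implementation; the toy graph, feature sizes, and random weights are all assumptions for demonstration.

```python
import numpy as np

def leaky_relu(x, alpha=0.2):
    return np.where(x > 0, x, alpha * x)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def gat_layer(H, A, W, a):
    """Single-head graph attention layer (illustrative, not GATES itself).

    H: (N, F) node features; A: (N, N) adjacency with self-loops;
    W: (F, F') shared weight matrix; a: (2*F',) attention vector.
    Returns (N, F') updated node features.
    """
    Z = H @ W                                   # project features: (N, F')
    N = H.shape[0]
    out = np.zeros((N, Z.shape[1]))
    for i in range(N):
        nbrs = np.where(A[i] > 0)[0]
        # attention logits over the neighborhood of node i
        e = np.array([leaky_relu(a @ np.concatenate([Z[i], Z[j]]))
                      for j in nbrs])
        alpha = softmax(e)                      # normalized attention weights
        out[i] = (alpha[:, None] * Z[nbrs]).sum(axis=0)
    return out

# Toy graph: 3 fully connected nodes with self-loops; 4-dim features -> 2-dim.
rng = np.random.default_rng(0)
H = rng.normal(size=(3, 4))
A = np.ones((3, 3))
W = rng.normal(size=(4, 2))
a = rng.normal(size=(4,))
H_new = gat_layer(H, A, W, a)
print(H_new.shape)  # prints (3, 2)
```

In an entity summarization setting such as the one the abstract describes, the node features would come from knowledge graph and text embeddings of the triples around the target entity, and the attention-weighted representations would feed a downstream triple-scoring model.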