Concept Distillation in Graph Neural Networks

Cited by: 3
Authors
Magister, Lucie Charlotte [1 ]
Barbiero, Pietro [1 ,2 ]
Kazhdan, Dmitry [1 ]
Siciliano, Federico [3 ]
Ciravegna, Gabriele [4 ]
Silvestri, Fabrizio [3 ]
Jamnik, Mateja [1 ]
Lio, Pietro [1 ]
Affiliations
[1] Univ Cambridge, Cambridge CB3 0FD, England
[2] Univ Svizzera Italiana, CH-6900 Lugano, Switzerland
[3] Univ Roma La Sapienza, I-00185 Rome, Italy
[4] Politecn Torino, I-10129 Turin, Italy
Funding
European Union Horizon 2020;
Keywords
Explainability; Concepts; Graph Neural Networks;
DOI
10.1007/978-3-031-44070-0_12
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104; 0812; 0835; 1405;
Abstract
The opaque reasoning of Graph Neural Networks undermines human trust. Existing graph network explainers attempt to address this issue by providing post-hoc explanations; however, they fail to make the model itself more interpretable. To fill this gap, we introduce the Concept Distillation Module, the first differentiable concept-distillation approach for graph networks. The proposed approach is a layer that can be plugged into any graph network to make it explainable by design: it first distills graph concepts from the latent space and then uses these to solve the task. Our results demonstrate that this approach allows graph networks to: (i) attain accuracy comparable to that of their vanilla equivalents, (ii) distill meaningful concepts, achieving 4.8% higher concept completeness and 36.5% lower purity scores on average, (iii) provide high-quality concept-based logic explanations for their predictions, and (iv) support effective interventions at test time, which can increase human trust as well as improve model performance.
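To make the plug-in idea concrete, below is a minimal, hypothetical PyTorch sketch of what such a concept layer could look like: latent GNN embeddings are mapped to soft concept scores, and the task prediction is computed from the concepts alone, so the classifier is interpretable by construction. The class name ConceptLayer, the linear concept encoder, and the sigmoid gating are illustrative assumptions, not the paper's actual module, which distills concepts from the graph network's latent space.

```python
import torch
import torch.nn as nn

class ConceptLayer(nn.Module):
    """Hypothetical plug-in layer: latent embeddings -> concepts -> task.

    The bottleneck forces predictions to flow through the concept
    scores, which is what makes the model explainable by design.
    """

    def __init__(self, hidden_dim: int, n_concepts: int, n_classes: int):
        super().__init__()
        self.concept_encoder = nn.Linear(hidden_dim, n_concepts)
        self.task_head = nn.Linear(n_concepts, n_classes)

    def forward(self, h: torch.Tensor):
        # Soft concept activations in [0, 1]; thresholding them at
        # test time yields Boolean concepts usable in logic
        # explanations, and overwriting a score simulates a
        # human intervention on that concept.
        concepts = torch.sigmoid(self.concept_encoder(h))
        logits = self.task_head(concepts)
        return concepts, logits

# Usage: append after any GNN encoder's readout, e.g. a pooled
# graph embedding of shape [batch, hidden_dim].
layer = ConceptLayer(hidden_dim=64, n_concepts=10, n_classes=2)
h = torch.randn(8, 64)        # stand-in for GNN embeddings
concepts, logits = layer(h)   # concepts: [8, 10], logits: [8, 2]
```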
Pages: 233-255 (23 pages)
Related Papers
50 records in total
  • [21] The Devil is in the Data: Learning Fair Graph Neural Networks via Partial Knowledge Distillation
    Zhu, Yuchang
    Li, Jintang
    Chen, Liang
    Zheng, Zibin
    [J]. PROCEEDINGS OF THE 17TH ACM INTERNATIONAL CONFERENCE ON WEB SEARCH AND DATA MINING, WSDM 2024, 2024, : 1012 - 1021
  • [22] Extract the Knowledge of Graph Neural Networks and Go Beyond it: An Effective Knowledge Distillation Framework
    Yang, Cheng
    Liu, Jiawei
    Shi, Chuan
    [J]. PROCEEDINGS OF THE WORLD WIDE WEB CONFERENCE 2021 (WWW 2021), 2021, : 1227 - 1237
  • [23] Online cross-layer knowledge distillation on graph neural networks with deep supervision
    Guo, Jiongyu
    Chen, Defang
    Wang, Can
    [J]. NEURAL COMPUTING & APPLICATIONS, 2023, 35 (30): 22359 - 22374
  • [24] Double Wins: Boosting Accuracy and Efficiency of Graph Neural Networks by Reliable Knowledge Distillation
    Tan, Qiaoyu
    Zha, Daochen
    Liu, Ninghao
    Choi, Soo-Hyun
    Li, Li
    Chen, Rui
    Hu, Xia
    [J]. 23RD IEEE INTERNATIONAL CONFERENCE ON DATA MINING, ICDM 2023, 2023, : 1343 - 1348
  • [25] Graph neural networks
    Corso, Gabriele
    Stark, Hannes
    Jegelka, Stefanie
    Jaakkola, Tommi
    Barzilay, Regina
    [J]. NATURE REVIEWS METHODS PRIMERS, 2024, 4 (01):
  • [27] Graph Neural Networks for Graph Drawing
    Tiezzi, Matteo
    Ciravegna, Gabriele
    Gori, Marco
    [J]. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, 35 (04) : 4668 - 4681
  • [28] Graph Mining with Graph Neural Networks
    Jin, Wei
    [J]. WSDM '21: PROCEEDINGS OF THE 14TH ACM INTERNATIONAL CONFERENCE ON WEB SEARCH AND DATA MINING, 2021, : 1119 - 1120
  • [29] Graph Clustering with Graph Neural Networks
    Tsitsulin, Anton
    Palowitch, John
    Perozzi, Bryan
    Mueller, Emmanuel
    [J]. JOURNAL OF MACHINE LEARNING RESEARCH, 2023, 24
  • [30] Combining Concept Graph with Improved Neural Networks for Chinese Short Text Classification
    Liao, Jialu
    Sun, Fanke
    Gu, Jinguang
    [J]. SEMANTIC TECHNOLOGY, JIST 2019, 2020, 1157 : 205 - 212