Fine-Tuning Graph Neural Networks by Preserving Graph Generative Patterns

Cited by: 0
Authors
Sun, Yifei [1 ]
Zhu, Qi [2 ]
Yang, Yang [1 ]
Wang, Chunping [3 ]
Fan, Tianyu [1 ]
Zhu, Jiajun [1 ]
Chen, Lei [3 ]
Affiliations
[1] Zhejiang Univ, Hangzhou, Peoples R China
[2] Univ Illinois, Urbana, IL USA
[3] FinVolution Grp, Shanghai, Peoples R China
Keywords
CONVERGENT SEQUENCES;
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Recently, the paradigm of pre-training and fine-tuning graph neural networks has been intensively studied and applied in a wide range of graph mining tasks. Its success is generally attributed to the structural consistency between pre-training and downstream datasets, which, however, does not hold in many real-world scenarios. Existing works have shown that the structural divergence between pre-training and downstream graphs significantly limits the transferability when using the vanilla fine-tuning strategy. This divergence leads to model overfitting on pre-training graphs and causes difficulties in capturing the structural properties of the downstream graphs. In this paper, we identify the fundamental cause of structural divergence as the discrepancy of generative patterns between the pre-training and downstream graphs. Furthermore, we propose G-TUNING to preserve the generative patterns of downstream graphs. Given a downstream graph G, the core idea is to tune the pre-trained GNN so that it can reconstruct the generative patterns of G, the graphon W. However, the exact reconstruction of a graphon is known to be computationally expensive. To overcome this challenge, we provide a theoretical analysis that establishes the existence of a set of alternative graphons called graphon bases for any given graphon. By utilizing a linear combination of these graphon bases, we can efficiently approximate W. This theoretical finding forms the basis of our model, as it enables effective learning of the graphon bases and their associated coefficients. Compared with existing algorithms, G-TUNING demonstrates consistent performance improvement in 7 in-domain and 7 out-of-domain transfer learning experiments.
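The abstract's central idea, approximating a graphon W by a linear combination of graphon bases, can be illustrated with a small sketch. This is not the paper's implementation: the function name `fit_graphon_coefficients` and the toy step-function graphons are assumptions for illustration only, and graphons are represented here as step-function matrices with coefficients fitted by least squares rather than learned jointly with the GNN as in G-TUNING.

```python
import numpy as np

def fit_graphon_coefficients(W, bases):
    """Fit coefficients alpha so that sum_k alpha[k] * bases[k]
    approximates the step-function graphon W in Frobenius norm."""
    # Stack each basis graphon as a column of the design matrix.
    A = np.stack([B.ravel() for B in bases], axis=1)  # shape (r*r, K)
    alpha, *_ = np.linalg.lstsq(A, W.ravel(), rcond=None)
    W_hat = sum(a * B for a, B in zip(alpha, bases))
    return alpha, W_hat

# Toy example: W is an exact mixture of two basis graphons.
B1 = np.full((4, 4), 0.2)                       # constant (Erdos-Renyi-style) graphon
B2 = np.kron(np.eye(2), np.ones((2, 2))) * 0.8  # block-diagonal (community) graphon
W = 0.5 * B1 + 0.5 * B2
alpha, W_hat = fit_graphon_coefficients(W, [B1, B2])
```

Because W lies exactly in the span of the two bases here, the fitted coefficients recover the mixture weights and the reconstruction is exact; in the paper's setting the bases and coefficients are instead learned so that the combination only approximates W efficiently.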
Pages: 9053-9061 (9 pages)
Related Papers
50 records
  • [1] Fine-Tuning Graph Neural Networks via Graph Topology Induced Optimal Transport
    Zhang, Jiying
    Xiao, Xi
    Huang, Long-Kai
    Rong, Yu
    Bian, Yatao
    [J]. IJCAI International Joint Conference on Artificial Intelligence, 2022, : 3730 - 3736
  • [2] Combining Large Model Fine-Tuning and Graph Neural Networks for Knowledge Graph Question Answering
    Chen, Junzhen
    Wang, Shuying
    Luo, Haoran
    [J]. Computer Engineering and Applications, 2024, 60 (24) : 166 - 176
  • [3] Drop edges and adapt: A fairness enforcing fine-tuning for graph neural networks
    Spinelli, Indro
    Bianchini, Riccardo
    Scardapane, Simone
    [J]. NEURAL NETWORKS, 2023, 167 : 159 - 167
  • [4] Measuring Task Similarity and Its Implication in Fine-Tuning Graph Neural Networks
    Huang, Renhong
    Xu, Jiarong
    Jiang, Xin
    Pan, Chenglu
    Yang, Zhiming
    Wang, Chunping
    Yang, Yang
    [J]. THIRTY-EIGHTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 38 NO 11, 2024, : 12617 - 12625
  • [5] Drop Edges and Adapt: A Fairness Enforcing Fine-Tuning for Graph Neural Networks
    Spinelli, Indro
    Bianchini, Riccardo
    Scardapane, Simone
    [J]. SSRN, 2023,
  • [6] Voucher Abuse Detection with Prompt-based Fine-tuning on Graph Neural Networks
    Wen, Zhihao
    Fang, Yuan
    Liu, Yihan
    Guo, Yang
    Hao, Shuji
    [J]. PROCEEDINGS OF THE 32ND ACM INTERNATIONAL CONFERENCE ON INFORMATION AND KNOWLEDGE MANAGEMENT, CIKM 2023, 2023, : 4864 - 4870
  • [7] Fine-Tuning Neural Patient Question Retrieval Model with Generative Adversarial Networks
    Tang, Guoyu
    Ni, Yuan
    Wang, Keqiang
    Yong, Qin
    [J]. BUILDING CONTINENTS OF KNOWLEDGE IN OCEANS OF DATA: THE FUTURE OF CO-CREATED EHEALTH, 2018, 247 : 720 - 724
  • [8] Knowledge Graph Fusion for Language Model Fine-Tuning
    Bhana, Nimesh
    van Zyl, Terence L.
    [J]. 2022 9TH INTERNATIONAL CONFERENCE ON SOFT COMPUTING & MACHINE INTELLIGENCE, ISCMI, 2022, : 167 - 172
  • [9] Deep Generative Probabilistic Graph Neural Networks for Scene Graph Generation
    Khademi, Mahmoud
    Schulte, Oliver
    [J]. THIRTY-FOURTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THE THIRTY-SECOND INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE AND THE TENTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2020, 34 : 11237 - 11245
  • [10] Fine-Tuning and the Stability of Recurrent Neural Networks
    MacNeil, David
    Eliasmith, Chris
    [J]. PLOS ONE, 2011, 6 (09):