Graph Invariant Learning with Subgraph Co-mixup for Out-of-Distribution Generalization

Cited by: 0
Authors
Jia, Tianrui [1 ]
Li, Haoyang [2 ]
Yang, Cheng [1 ]
Tao, Tao [3 ]
Shi, Chuan [1 ]
Affiliations
[1] Beijing Univ Posts & Telecommun, Beijing, Peoples R China
[2] Tsinghua Univ, Beijing, Peoples R China
[3] China Mobile Informat Technol Co Ltd, Beijing, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial intelligence theory];
Discipline Codes
081104; 0812; 0835; 1405;
Abstract
Graph neural networks (GNNs) have been shown to perform well in graph representation learning, but they often lack generalization capability when tackling out-of-distribution (OOD) data. Graph invariant learning methods, backed by the invariance principle across multiple defined environments, have proven effective in dealing with this issue. However, existing methods rely heavily on well-predefined or accurately generated environment partitions, which are hard to obtain in practice, leading to sub-optimal OOD generalization performance. In this paper, we propose a novel graph invariant learning method based on a co-mixup strategy over invariant and variant patterns, which jointly generates mixed multiple environments and captures invariant patterns from the mixed graph data. Specifically, we first adopt a subgraph extractor to identify invariant subgraphs. Subsequently, we design a novel co-mixup strategy, i.e., jointly conducting environment mixup and invariant mixup. For the environment mixup, we mix the variant, environment-related subgraphs so as to generate sufficiently diverse multiple environments, which is important for guaranteeing the quality of graph invariant learning. For the invariant mixup, we mix the invariant subgraphs, further encouraging the model to capture invariant patterns behind graphs while discarding spurious correlations for OOD generalization. We demonstrate that the proposed environment mixup and invariant mixup can mutually promote each other. Extensive experiments on both synthetic and real-world datasets demonstrate that our method significantly outperforms state-of-the-art baselines under various distribution shifts.
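To make the co-mixup idea described in the abstract concrete, below is a minimal, runnable Python/NumPy sketch of the two operations on toy graph data. The helper names (extract_invariant_subgraph, environment_mixup, invariant_mixup), the fixed invariant masks, and the feature-level interpolation are illustrative assumptions, not the authors' implementation, which uses a learned subgraph extractor and operates on actual graph structure.

    # Sketch only: environment mixup on variant subgraphs and invariant mixup
    # on invariant subgraphs, using hypothetical helpers and toy node features.
    import numpy as np

    rng = np.random.default_rng(0)

    def extract_invariant_subgraph(graph):
        """Hypothetical subgraph extractor: split a graph's nodes into an
        invariant (label-relevant) part and a variant, environment-related
        part. Here a fixed boolean mask stands in for the learned extractor."""
        mask = graph["invariant_mask"]
        return {"x": graph["x"][mask]}, {"x": graph["x"][~mask]}

    def environment_mixup(var_a, var_b, lam):
        """Mix two variant (environment-related) subgraphs to synthesize a
        new, more diverse training environment."""
        n = min(len(var_a["x"]), len(var_b["x"]))
        return {"x": lam * var_a["x"][:n] + (1 - lam) * var_b["x"][:n]}

    def invariant_mixup(inv_a, inv_b, y_a, y_b, lam):
        """Mix two invariant subgraphs (and their labels) to reinforce
        label-relevant, environment-agnostic patterns."""
        n = min(len(inv_a["x"]), len(inv_b["x"]))
        x = lam * inv_a["x"][:n] + (1 - lam) * inv_b["x"][:n]
        y = lam * y_a + (1 - lam) * y_b
        return {"x": x}, y

    # Toy batch: each graph is node features plus a mask marking invariant nodes.
    batch = [
        {"x": rng.normal(size=(6, 4)),
         "invariant_mask": np.array([1, 1, 1, 0, 0, 0], dtype=bool)},
        {"x": rng.normal(size=(6, 4)),
         "invariant_mask": np.array([1, 1, 0, 0, 1, 0], dtype=bool)},
    ]
    labels = [np.array(0.0), np.array(1.0)]

    lam = rng.beta(2.0, 2.0)  # mixup coefficient
    (inv0, var0), (inv1, var1) = (extract_invariant_subgraph(g) for g in batch)

    mixed_env = environment_mixup(var0, var1, lam)  # synthetic new environment
    mixed_inv, mixed_y = invariant_mixup(inv0, inv1, labels[0], labels[1], lam)

    print(mixed_env["x"].shape, mixed_inv["x"].shape, mixed_y)

In the paper's framing, the environments produced by environment_mixup diversify the training distribution so that the invariant-learning objective is well posed, while invariant_mixup regularizes the predictor toward the label-relevant subgraphs; the two operations are applied jointly rather than in isolation.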
Pages: 8562-8570
Page count: 9
Related Papers
50 records in total
  • [1] Learning Invariant Graph Representations for Out-of-Distribution Generalization
    Li, Haoyang
    Zhang, Ziwei
    Wang, Xin
    Zhu, Wenwu
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022,
  • [2] DIVE: Subgraph Disagreement for Graph Out-of-Distribution Generalization
    Sun, Xin
    Wang, Liang
    Liu, Qiang
    Wu, Shu
    Wang, Zilei
    Wang, Liang
    PROCEEDINGS OF THE 30TH ACM SIGKDD CONFERENCE ON KNOWLEDGE DISCOVERY AND DATA MINING, KDD 2024, 2024, : 2794 - 2805
  • [3] Learning Causally Invariant Representations for Out-of-Distribution Generalization on Graphs
    Chen, Yongqiang
    Zhang, Yonggang
    Bian, Yatao
    Yang, Han
    Ma, Kaili
    Xie, Binghui
    Liu, Tongliang
    Han, Bo
    Cheng, James
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022,
  • [4] Graph out-of-distribution generalization through contrastive learning paradigm
    Du, Hongyi
    Li, Xuewei
    Shao, Minglai
    KNOWLEDGE-BASED SYSTEMS, 2025, 315
  • [5] Out-of-distribution Generalization with Causal Invariant Transformations
    Wang, Ruoyu
    Yi, Mingyang
    Chen, Zhitang
    Zhu, Shengyu
    2022 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR 2022), 2022, : 375 - 385
  • [6] On the Connection between Invariant Learning and Adversarial Training for Out-of-Distribution Generalization
    Xin, Shiji
    Wang, Yifei
    Su, Jingtong
    Wang, Yisen
    THIRTY-SEVENTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 37 NO 9, 2023, : 10519 - 10527
  • [7] FLOOD: A Flexible Invariant Learning Framework for Out-of-Distribution Generalization on Graphs
    Liu, Yang
    Ao, Xiang
    Feng, Fuli
    Ma, Yunshan
    Li, Kuan
    Chua, Tat-Seng
    He, Qing
    PROCEEDINGS OF THE 29TH ACM SIGKDD CONFERENCE ON KNOWLEDGE DISCOVERY AND DATA MINING, KDD 2023, 2023, : 1548 - 1558
  • [8] Environment-Aware Dynamic Graph Learning for Out-of-Distribution Generalization
    Yuan, Haonan
    Sun, Qingyun
    Fu, Xingcheng
    Zhang, Ziwei
    Ji, Cheng
    Peng, Hao
    Li, Jianxin
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023,
  • [9] Negative as Positive: Enhancing Out-of-distribution Generalization for Graph Contrastive Learning
    Wang, Zixu
    Xu, Bingbing
    Yuan, Yige
    Shen, Huawei
    Cheng, Xueqi
    PROCEEDINGS OF THE 47TH INTERNATIONAL ACM SIGIR CONFERENCE ON RESEARCH AND DEVELOPMENT IN INFORMATION RETRIEVAL, SIGIR 2024, 2024, : 2548 - 2552
  • [10] Discovering causally invariant features for out-of-distribution generalization
    Wang, Yujie
    Yu, Kui
    Xiang, Guodu
    Cao, Fuyuan
    Liang, Jiye
    PATTERN RECOGNITION, 2024, 150