Graph Invariant Learning with Subgraph Co-mixup for Out-of-Distribution Generalization

Cited: 0
Authors
Jia, Tianrui [1 ]
Li, Haoyang [2 ]
Yang, Cheng [1 ]
Tao, Tao [3 ]
Shi, Chuan [1 ]
Affiliations
[1] Beijing Univ Posts & Telecommun, Beijing, Peoples R China
[2] Tsinghua Univ, Beijing, Peoples R China
[3] China Mobile Informat Technol Co Ltd, Beijing, Peoples R China
Funding
National Natural Science Foundation of China;
DOI
Not available
CLC Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Graph neural networks (GNNs) have been demonstrated to perform well in graph representation learning, but they often lack generalization capability when tackling out-of-distribution (OOD) data. Graph invariant learning methods, backed by the invariance principle across multiple defined environments, have shown effectiveness in dealing with this issue. However, existing methods rely heavily on well-predefined or accurately generated environment partitions, which are hard to obtain in practice, leading to sub-optimal OOD generalization performance. In this paper, we propose a novel graph invariant learning method based on a co-mixup strategy over invariant and variant patterns, which jointly generates mixed multiple environments and captures invariant patterns from the mixed graph data. Specifically, we first adopt a subgraph extractor to identify invariant subgraphs. Subsequently, we design a novel co-mixup strategy, i.e., jointly conducting environment mixup and invariant mixup. For the environment mixup, we mix the variant, environment-related subgraphs so as to generate sufficiently diverse multiple environments, which is important to guarantee the quality of graph invariant learning. For the invariant mixup, we mix the invariant subgraphs, further encouraging the model to capture the invariant patterns behind graphs while discarding spurious correlations, for OOD generalization. We demonstrate that the proposed environment mixup and invariant mixup can mutually promote each other. Extensive experiments on both synthetic and real-world datasets demonstrate that our method significantly outperforms state-of-the-art methods under various distribution shifts.
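As a rough illustration only (not the authors' implementation), the joint co-mixup of the two subgraph branches described above might be sketched as follows. The function names, the Beta-distributed mixing ratio, and the assumption that invariant and variant subgraphs are available as fixed-size embedding vectors are all hypothetical simplifications for exposition.

```python
import numpy as np

def mixup(a, b, lam):
    """Linear interpolation shared by both mixup branches."""
    return lam * a + (1.0 - lam) * b

def co_mixup(inv_a, var_a, inv_b, var_b, alpha=1.0, rng=None):
    """Jointly mix two graphs' invariant and variant subgraph embeddings.

    Environment mixup blends the variant (environment-related) embeddings
    to synthesize a new, more diverse environment; invariant mixup blends
    the invariant embeddings to reinforce label-relevant patterns.
    """
    rng = rng or np.random.default_rng()
    lam = rng.beta(alpha, alpha)  # mixing ratio drawn from Beta(alpha, alpha)
    mixed_env = mixup(var_a, var_b, lam)  # environment mixup
    mixed_inv = mixup(inv_a, inv_b, lam)  # invariant mixup
    return mixed_inv, mixed_env, lam
```

In this sketch a single ratio `lam` couples the two branches, reflecting the paper's claim that the two mixups are conducted jointly rather than independently.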
Pages: 8562-8570
Page count: 9