Understanding Dropout for Graph Neural Networks

Cited by: 4
Authors
Shu, Juan [1 ]
Xi, Bowei [1 ]
Li, Yu [2 ]
Wu, Fan [1 ]
Kamhoua, Charles [3 ]
Ma, Jianzhu [4 ]
Affiliations
[1] Purdue Univ, Dept Stat, W Lafayette, IN 47907 USA
[2] Chinese Univ Hong Kong, Comp Sci & Engn, Hong Kong, Peoples R China
[3] US Army Res Lab, Adelphi, MD USA
[4] Peking Univ, Inst Artificial Intelligence, Beijing, Peoples R China
Keywords
Graph neural network; dropout; over-smoothing
DOI
10.1145/3487553.3524725
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Graph neural networks (GNNs) have demonstrated superior performance on graph learning tasks. A GNN captures data dependencies via message passing, so the prediction of a node's label can draw on information from its neighbors in the graph. Dropout, which has been carefully studied for convolutional neural networks (CNNs), serves as both a regularization and an ensemble method. However, few existing works have focused on dropout schemes for GNNs. Although GNNs and CNNs share a similar architecture, with both convolutional and fully connected layers, their input data structures differ and so do their convolution operations. This suggests that dropout schemes designed for CNNs should not be applied directly to GNNs without a good understanding of their impact. In this paper, we divide the existing dropout schemes for GNNs into two categories: (1) dropout on feature maps and (2) dropout on the graph structure. Motivated by the drawbacks of current GNN dropout models, we propose a novel layer compensation dropout and a novel adaptive heteroscedastic Gaussian dropout, both of which can be applied to any type of GNN model and outperform their corresponding baselines in shallow GNNs. An experimental study then shows that Bernoulli dropout generalizes better, while Gaussian dropout is slightly stronger in transductive performance. Finally, we theoretically study how different dropout schemes mitigate the over-smoothing problem; experiments show that layer compensation dropout allows a GNN to maintain or slightly improve its performance as more layers are added, whereas all other dropout models suffer performance degradation as the GNN grows deeper.
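To make the two categories concrete, below is a minimal PyTorch sketch (a toy illustration under our own assumptions, not the authors' implementation): category (1) masks entries of the node feature matrix, while category (2) masks edges of the adjacency matrix, in the spirit of DropEdge-style schemes. The function names and the dense-adjacency toy graph are hypothetical.

```python
import torch

# Toy illustration of the abstract's two dropout categories for GNNs.
# Not the paper's code; function names and the dense toy graph are hypothetical.

def feature_dropout(x: torch.Tensor, p: float = 0.5) -> torch.Tensor:
    # Category (1): Bernoulli dropout on the node feature map,
    # with inverted scaling so the expected activation is unchanged.
    mask = torch.bernoulli(torch.full_like(x, 1.0 - p))
    return x * mask / (1.0 - p)

def edge_dropout(adj: torch.Tensor, p: float = 0.5) -> torch.Tensor:
    # Category (2): dropout on the graph structure; each edge is kept
    # independently with probability 1 - p (cf. DropEdge).
    mask = torch.bernoulli(torch.full_like(adj, 1.0 - p))
    return adj * mask

# One message-passing step on a random 4-node graph with 3-dim features.
x = torch.randn(4, 3)                   # node feature matrix
adj = (torch.rand(4, 4) > 0.5).float()  # dense adjacency matrix
h = edge_dropout(adj) @ feature_dropout(x)
print(h.shape)  # torch.Size([4, 3])
```

For reference, standard Gaussian dropout replaces the Bernoulli mask with multiplicative noise drawn from N(1, p/(1-p)); how the paper's layer compensation and adaptive heteroscedastic variants build on these primitives is not fully specified by the abstract.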
Pages: 1128-1138 (11 pages)