Leiden-Fusion Partitioning Method for Effective Distributed Training of Graph Embeddings

Cited by: 0
Authors
Bai, Yuhe [1 ]
Constantin, Camelia [1 ]
Naacke, Hubert [1 ]
Affiliations
[1] Sorbonne Univ, LIP6, Paris, France
Keywords
Distributed Training; Graph Embeddings; Graph Partitioning
DOI
10.1007/978-3-031-70368-3_22
CLC Number
TP18 [Theory of Artificial Intelligence]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
In the area of large-scale training of graph embeddings, effective training frameworks and partitioning methods are critical for handling large networks. However, they face two major challenges: 1) existing synchronized distributed frameworks require continuous communication to access information from other machines, and 2) current partitioning methods cannot ensure that subgraphs remain connected components without isolated nodes, which is essential for effective training of graph neural networks (GNNs), since training relies on aggregating information from neighboring nodes. To address these issues, we introduce a novel partitioning method, named Leiden-Fusion, designed for large-scale training of graphs with minimal communication. Our method extends the Leiden community detection algorithm with a greedy algorithm that merges the smallest communities with highly connected neighboring communities. It guarantees that, for an initially connected graph, each partition is a densely connected subgraph with no isolated nodes. After obtaining the partitions, we train a GNN for each partition independently, and finally integrate all embeddings for node classification tasks, which significantly reduces the need for network communication and enhances the efficiency of distributed graph training. We demonstrate the effectiveness of our method through extensive evaluations on several benchmark datasets, achieving high efficiency while preserving the quality of the graph embeddings for node classification tasks.
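The partition-then-merge idea in the abstract can be sketched as follows. This is an illustrative approximation, not the authors' implementation: networkx's Louvain algorithm stands in for Leiden (a Leiden implementation is available separately in the `leidenalg` package), and the `fusion_partition` name and the "most cross-edges" merge criterion are our own hedged reading of "merges the smallest communities with highly connected neighboring communities".

```python
import networkx as nx


def fusion_partition(G, num_parts):
    """Partition G into num_parts connected groups of nodes:
    detect communities, then greedily fuse the smallest community
    into the neighboring community it shares the most edges with."""
    # Step 1: community detection (Louvain as a stand-in for Leiden).
    comms = [set(c) for c in nx.community.louvain_communities(G, seed=0)]

    # Step 2: greedy fusion until the target partition count is reached.
    while len(comms) > num_parts:
        comms.sort(key=len)
        small = comms.pop(0)  # smallest community
        # Pick the remaining community with the most cross-edges to `small`.
        best, best_edges = 0, -1
        for i, other in enumerate(comms):
            cross = sum(1 for u in small for v in G[u] if v in other)
            if cross > best_edges:
                best, best_edges = i, cross
        comms[best] |= small

    return comms
```

For a connected input graph whose initial communities are each connected, every fusion step merges two connected node sets joined by at least one edge, so each final partition induces a connected subgraph with no isolated nodes, matching the guarantee stated in the abstract.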
Pages: 366-382
Page count: 17
Related Papers
6 records
  • [1] A Semantic Partitioning Method for Large-Scale Training of Knowledge Graph Embeddings
    Bai, Yuhe
    Naacke, Hubert
    Constantin, Camelia
    COMPANION OF THE WORLD WIDE WEB CONFERENCE, WWW 2023, 2023, : 573 - 577
  • [2] Distributed Training of Embeddings using Graph Analytics
    Gill, Gurbinder
    Dathathri, Roshan
    Maleki, Saeed
    Musuvathi, Madan
    Mytkowicz, Todd
    Saarikivi, Olli
    2021 IEEE 35TH INTERNATIONAL PARALLEL AND DISTRIBUTED PROCESSING SYMPOSIUM (IPDPS), 2021, : 973 - 983
  • [3] Attribute-driven streaming edge partitioning with reconciliations for distributed graph neural network training
    Mu, Zongshen
    Tang, Siliang
    Zhuang, Yueting
    Yu, Dianhai
    NEURAL NETWORKS, 2023, 165 : 987 - 998
  • [4] A computational-graph partitioning method for training memory-constrained DNNs
    Qararyah, Fareed
    Wahib, Mohamed
    Dikbayir, Doga
    Belviranli, Mehmet Esat
    Unat, Didem
    PARALLEL COMPUTING, 2021, 104
  • [5] Minimum degree reordering based graph partitioning method for distributed fault section estimation system in power networks
    Bi, T
    Ni, YX
    Wu, FF
    Yang, QX
    2001 IEEE/PES TRANSMISSION AND DISTRIBUTION CONFERENCE AND EXPOSITION, VOLS 1 AND 2: DEVELOPING NEW PERSPECTIVES, 2001, : 212 - 216
  • [6] Sketch-fusion: A gradient compression method with multi-layer fusion for communication-efficient distributed training
    Dai, Lingfei
    Gong, Luqi
    An, Zhulin
    Xu, Yongjun
    Diao, Boyu
    JOURNAL OF PARALLEL AND DISTRIBUTED COMPUTING, 2024, 185