The effect of network topologies on fully decentralized learning: a preliminary investigation

Cited by: 2
Authors
Palmieri, Luigi [1 ]
Valerio, Lorenzo [1 ]
Boldrini, Chiara [1 ]
Passarella, Andrea [1 ]
Affiliations
[1] IIT CNR, Pisa, Italy
Funding
EU Horizon 2020
Keywords
decentralized learning; graph topologies; non-IID data;
DOI
10.1145/3597062.3597280
CLC number
TP18 [Artificial intelligence theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
In a decentralized machine learning system, data is typically partitioned among multiple devices or nodes, each of which trains a local model using its own data. These local models are then shared and combined to create a global model that can make accurate predictions on new data. In this paper, we start exploring the role of the network topology connecting nodes on the performance of a Machine Learning model trained through direct collaboration between nodes. We investigate how different types of topologies impact the "spreading of knowledge", i.e., the ability of nodes to incorporate into their local models the knowledge derived from learning patterns in the data available at other nodes across the network. Specifically, we highlight the different roles played in this process by more and less connected nodes (hubs and leaves), as well as by macroscopic network properties (primarily, degree distribution and modularity). Among other findings, we show that, while even weak connectivity among network components is known to be sufficient for information spread, it may not be sufficient for knowledge spread. More intuitively, we also find that hubs play a more significant role than leaves in spreading knowledge, and that this holds not only for heavy-tailed degree distributions but also when "hubs" have only moderately more connections than leaves. Finally, we show that tightly knit communities severely hinder knowledge spread.
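To make the collaboration scheme described in the abstract concrete, the following is a minimal, hypothetical sketch (Python, using numpy and networkx; it is not the authors' implementation). It replaces actual model training with scalar consensus averaging: each node repeatedly mixes its local value with those of its neighbours, so the residual disagreement after a fixed number of rounds is a rough proxy for how quickly "knowledge" can spread over a given topology. The node count, round count, and choice of topologies are illustrative assumptions.

```python
# Toy illustration (assumed setup, not the paper's method): decentralized
# averaging of scalar "model" parameters over different graph topologies.
import numpy as np
import networkx as nx


def decentralized_averaging(G, rounds=20, seed=0):
    """Each node holds one scalar; every round it replaces it with the mean of
    its own value and its neighbours' values (a basic gossip/consensus step).
    Returns the spread (max - min) of the values after each round."""
    rng = np.random.default_rng(seed)
    x = rng.normal(size=G.number_of_nodes())  # initial local "models"
    spread = []
    for _ in range(rounds):
        new_x = x.copy()
        for node in G.nodes():
            neigh = list(G.neighbors(node))
            new_x[node] = np.mean(x[[node] + neigh])  # average self + neighbours
        x = new_x
        spread.append(x.max() - x.min())
    return spread


if __name__ == "__main__":
    n = 32  # arbitrary network size for the illustration
    topologies = {
        "star (one hub)": nx.star_graph(n - 1),
        "ring (no hubs)": nx.cycle_graph(n),
        "Barabasi-Albert (heavy-tailed)": nx.barabasi_albert_graph(n, 2, seed=1),
    }
    for name, G in topologies.items():
        residual = decentralized_averaging(G)[-1]
        print(f"{name:32} residual disagreement after 20 rounds: {residual:.4f}")
```

In this toy setting, the hub-containing topologies (star, Barabasi-Albert) typically reach agreement faster than the ring, loosely mirroring the paper's observation that hubs matter for spreading knowledge; real decentralized learning additionally involves local training and non-IID data, which this sketch does not model.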
Pages: 25-30
Page count: 6
Related papers
50 records in total
  • [41] Fully-Decentralized Multi-Kernel Online Learning over Networks
    Chae, Jeongmin
    Mitra, Urbashi
    Hong, Songnam
    2021 IEEE GLOBAL COMMUNICATIONS CONFERENCE (GLOBECOM), 2021,
  • [42] I2Q: A Fully Decentralized Q-Learning Algorithm
    Jiang, Jiechuan
    Lu, Zongqing
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022,
  • [43] FULLY AUTOMATED NETWORK DESIGN BY DIGITAL COMPUTER - PRELIMINARY CONSIDERATIONS
    ROHRER, RA
    PROCEEDINGS OF THE INSTITUTE OF ELECTRICAL AND ELECTRONICS ENGINEERS, 1967, 55 (11): 1929
  • [44] Interplay of network topologies in aviation delay propagation: A complex network and machine learning analysis
    Li, Qiang
    Wu, Lu
    Guan, Xinjia
    Tian, Ze-jin
    PHYSICA A-STATISTICAL MECHANICS AND ITS APPLICATIONS, 2024, 638
  • [45] The cost of application-level broadcast in a fully decentralized peer-to-peer network
    Portmann, M
    Seneviratne, A
    ISCC 2002: SEVENTH INTERNATIONAL SYMPOSIUM ON COMPUTERS AND COMMUNICATIONS, PROCEEDINGS, 2002: 941-946
  • [46] Integrating association rule mining and decision tree learning for network intrusion detection: A preliminary investigation
    Hossain, M
    6TH WORLD MULTICONFERENCE ON SYSTEMICS, CYBERNETICS AND INFORMATICS, VOL XI, PROCEEDINGS: COMPUTER SCIENCE II, 2002, : 65 - 70
  • [47] Data-Driven Adaptive Consensus Learning From Network Topologies
    Chi, Ronghu
    Hui, Yu
    Huang, Biao
    Hou, Zhongsheng
    Bu, Xuhui
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2022, 33 (8): 3487-3497
  • [48] Decentralized learning for medical image classification with prototypical contrastive network
    Cao, Zhantao
    Shi, Yuanbing
    Zhang, Shuli
    Chen, Huanan
    Liu, Weide
    Yue, Guanghui
    Lin, Huazhen
    MEDICAL PHYSICS, 2025,
  • [49] Impact of Network Topology on the Convergence of Decentralized Federated Learning Systems
    Kavalionak, Hanna
    Carlini, Emanuele
    Dazzi, Patrizio
    Ferrucci, Luca
    Mordacchini, Matteo
    Coppola, Massimo
    26TH IEEE SYMPOSIUM ON COMPUTERS AND COMMUNICATIONS (IEEE ISCC 2021), 2021,
  • [50] Decentralized learning over a network with Nystrom approximation using SGD
    Lian, Heng
    Liu, Jiamin
    APPLIED AND COMPUTATIONAL HARMONIC ANALYSIS, 2023, 66: 373-387