On Exploiting Network Topology for Hierarchical Coded Multi-Task Learning

Cited by: 1
Authors
Hu, Haoyang [1 ]
Li, Songze [2 ]
Cheng, Minquan [3 ]
Ma, Shuai [4 ]
Shi, Yuanming [1 ]
Wu, Youlong [1 ]
Affiliations
[1] ShanghaiTech Univ, Sch Informat Sci & Technol, Shanghai 201210, Peoples R China
[2] Southeast Univ, Sch Cyber Sci & Engn, Nanjing 211189, Peoples R China
[3] Guangxi Normal Univ, Guangxi Key Lab Multisource Informat Min & Secur, Guilin 541004, Peoples R China
[4] Peng Cheng Lab, Shenzhen 518055, Peoples R China
Funding
Natural Science Foundation of Shanghai;
Keywords
Relays; Servers; Task analysis; Distance learning; Computer aided instruction; Downlink; Uplink; Multi-task learning; coding techniques; distributed learning; hierarchical systems; communication load; COMBINATION NETWORKS;
DOI
10.1109/TCOMM.2024.3381671
CLC classification
TM [Electrical Engineering]; TN [Electronics and Communication Technology];
Subject classification codes
0808; 0809;
Abstract
Distributed multi-task learning (MTL) is a learning paradigm in which distributed users simultaneously learn multiple tasks by leveraging the correlations among tasks. However, distributed MTL suffers from a more severe communication bottleneck than single-task learning, because more than one model must be transmitted during the communication phase. To address this issue, we investigate a hierarchical MTL system in which distributed users wish to jointly learn different models orchestrated by a central server with the help of multiple relays. We propose a coded distributed computing scheme for hierarchical MTL systems that jointly exploits the network topology and the relays' computing capability to create coded multicast opportunities and thereby improve communication efficiency. We theoretically prove that the proposed scheme significantly reduces the communication loads of both the uplink and downlink transmissions between the relays and the server. To further establish the optimality of the proposed scheme, we derive information-theoretic lower bounds on the minimum uplink and downlink communication loads and prove that the gaps between the achievable upper bounds and these lower bounds are within the minimum number of users connected to any relay. In particular, when the network topology can be carefully designed, the proposed scheme achieves the information-theoretically optimal communication loads. Experiments on real-world datasets show that the proposed scheme greatly reduces the overall training time compared with the conventional hierarchical MTL scheme.
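To illustrate the coded-multicast idea that such schemes rely on (this is a generic textbook-style sketch, not the authors' exact construction), the following minimal Python example shows how a relay can XOR two model updates intended for two different users: each user cancels the update it already holds locally and recovers the one it is missing, so a single multicast transmission replaces two unicasts. The two-user topology, payload sizes, and all variable names are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch of XOR-based coded multicast (not the paper's scheme).
# Two users attached to one relay: each user has locally computed the update
# for its own task and still needs the update for the other task.

rng = np.random.default_rng(0)

# Model updates represented as byte vectors (hypothetical 8-byte payloads).
update_task_a = rng.integers(0, 256, size=8, dtype=np.uint8)  # held by user 1
update_task_b = rng.integers(0, 256, size=8, dtype=np.uint8)  # held by user 2

# Uncoded downlink: the relay would send both updates separately (2 messages).
# Coded downlink: the relay multicasts a single XOR-coded message instead.
coded_message = update_task_a ^ update_task_b

# Each user XORs out the update it already knows to decode the missing one.
decoded_by_user1 = coded_message ^ update_task_a  # user 1 recovers task-b update
decoded_by_user2 = coded_message ^ update_task_b  # user 2 recovers task-a update

assert np.array_equal(decoded_by_user1, update_task_b)
assert np.array_equal(decoded_by_user2, update_task_a)
print("One multicast message served both users instead of two unicast messages.")
```

The same cancellation principle, applied across the relays and the users they serve, is what allows topology-aware coding to cut the uplink and downlink loads described in the abstract.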
Pages: 4930-4944
Page count: 15