Numerical methods for distributed stochastic compositional optimization problems with aggregative structure

Cited by: 0
Authors
Zhao, Shengchao [1 ]
Liu, Yongchao [2 ]
Affiliations
[1] China University of Mining and Technology, School of Mathematics, Xuzhou, China
[2] Dalian University of Technology, School of Mathematical Sciences, Dalian, China
Keywords
Distributed stochastic compositional optimization; aggregative structure; hybrid variance reduction technique; dynamic consensus mechanism; communication compression; gradient descent; algorithms
DOI
10.1080/10556788.2024.2381214
Chinese Library Classification
TP31 [Computer Software]
Discipline Classification Code
081202; 0835
Abstract
This paper studies distributed stochastic compositional optimization problems over networks, in which the inner-level function is the sum of the agents' private expectation functions. Exploiting this aggregative structure, we employ a hybrid variance reduction technique to estimate each agent's private expectation function, and apply a dynamic consensus mechanism to track the aggregate inner-level function. Combining these tools with the standard distributed stochastic gradient descent method, we propose a distributed aggregative stochastic compositional gradient descent method, which achieves the convergence rate $\mathcal{O}(K^{-1/2})$ when the objective function is smooth. We further equip the proposed method with communication compression and show that the compressed variant maintains the $\mathcal{O}(K^{-1/2})$ convergence rate. Simulated experiments on decentralized reinforcement learning verify the effectiveness of the proposed methods.
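
The abstract names three ingredients: hybrid variance reduction for each agent's private inner function, dynamic consensus tracking of the aggregate inner-level information, and a distributed stochastic gradient step. The Python sketch below is not the authors' implementation; it is a minimal toy illustration of how these ingredients could fit together. Everything problem-specific is an assumption chosen for illustration: linear inner maps g_i(x) = A_i x, quadratic outer losses f_i(y) = 0.5 ||y - b_i||^2, a ring-graph mixing matrix W, and the step size alpha and variance-reduction weight beta.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, p = 8, 5, 3                        # agents, decision dim, inner-output dim

# Doubly stochastic mixing matrix for a ring graph (illustrative topology).
W = np.zeros((n, n))
for i in range(n):
    for j in (i - 1, i, i + 1):
        W[i, j % n] = 1.0 / 3.0

A = rng.normal(size=(n, p, d))           # inner functions: E[g_i](x) = A_i @ x
b = rng.normal(size=(n, p))              # outer functions: f_i(y) = 0.5||y - b_i||^2

def sample_Gi(i):
    """One stochastic sample of the matrix defining g_i (also its Jacobian)."""
    return A[i] + 0.1 * rng.normal(size=(p, d))

alpha, beta, K = 0.02, 0.8, 3000         # step size, VR weight, iterations
x = np.zeros((n, d))                                   # local decisions
u = np.stack([sample_Gi(i) @ x[i] for i in range(n)])  # VR estimates of g_i(x_i)
v = np.stack([sample_Gi(i) for i in range(n)])         # VR estimates of Jacobians
y, z = u.copy(), v.copy()                # dynamic-consensus trackers of averages

for k in range(K):
    x_next = W @ x                       # consensus averaging of decisions
    u_next, v_next = np.empty_like(u), np.empty_like(v)
    for i in range(n):
        G = sample_Gi(i)                 # one sample reused at old and new point
        # Hybrid variance reduction (STORM-style recursion) for inner values;
        # for the constant Jacobian it degenerates to a moving average.
        u_next[i] = G @ x_next[i] + (1.0 - beta) * (u[i] - G @ x[i])
        v_next[i] = G + (1.0 - beta) * (v[i] - G)
    # Dynamic consensus: y_i and z_i track the network averages of u_j and v_j.
    y = W @ y + u_next - u
    z = np.einsum('ij,jkl->ikl', W, z) + v_next - v
    u, v = u_next, v_next
    # Compositional gradient step: tracked Jacobian transposed times the outer
    # gradient, with the outer gradient evaluated at the tracked aggregate y_i.
    for i in range(n):
        x_next[i] -= alpha * z[i].T @ (y[i] - b[i])
    x = x_next

x_bar = x.mean(axis=0)
g_bar = A.mean(axis=0) @ x_bar
obj = 0.5 * np.mean(np.sum((g_bar - b) ** 2, axis=1))
print(f"consensus error {np.linalg.norm(x - x_bar):.2e}, objective {obj:.4f}")
```

On this toy problem the averaged update direction approximates the true compositional gradient, since y_i tracks the aggregate inner value and z_i tracks the aggregate Jacobian; the sketch omits all step-size conditions and assumptions under which the paper proves its $\mathcal{O}(K^{-1/2})$ rate.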
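The abstract gives no details of the compression operator used in the compressed variant. Schemes of this kind are typically paired with a contractive compressor applied to the differences agents transmit instead of full vectors; a top-k sparsifier is one standard example. The helper below is hypothetical and illustrative only; the paper's actual compressor and its error-feedback bookkeeping may differ.

```python
import numpy as np

def top_k(vec, k):
    """Contractive top-k compressor: keep the k largest-magnitude entries.

    In a compressed-communication variant, each agent would transmit
    top_k(state - reference) rather than its full state vector.
    """
    out = np.zeros_like(vec)
    idx = np.argpartition(np.abs(vec), -k)[-k:]
    out[idx] = vec[idx]
    return out
```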
Pages: 32