Decentralized Stochastic Optimization With Pairwise Constraints and Variance Reduction

Cited by: 0
Authors
Han, Fei [1 ]
Cao, Xuanyu [1 ]
Gong, Yi [2 ]
Affiliations
[1] Hong Kong Univ Sci & Technol, Dept Elect & Comp Engn, Clear Water Bay, Hong Kong, Peoples R China
[2] Southern Univ Sci & Technol, Univ Key Lab Adv Wireless Commun Guangdong Prov, Dept Elect & Elect Engn, Shenzhen 518055, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Stochastic optimization; distributed optimization; constrained optimization; variance reduction; ALGORITHM; CONVERGENCE; CONSENSUS;
DOI
10.1109/TSP.2024.3374082
Chinese Library Classification
TM [Electrical Engineering]; TN [Electronics and Communication Technology]
Discipline Codes
0808; 0809
Abstract
This paper studies decentralized finite-sum optimization over a network in which each pair of neighboring agents is coupled by a nonlinear proximity constraint. Each agent holds a private convex cost that decomposes into an average of multiple constituent functions, and the network collectively minimizes the sum of the individual costs subject to all constraints. Owing to their fast convergence and low computational burden, stochastic variance-reduction methods have been studied extensively for finite-sum minimization, but existing algorithms do not address constrained problems. To bridge this gap, we propose a decentralized stochastic algorithmic framework called VQ-VR, which extends the virtual-queue-based algorithm introduced in [1] to stochastic settings for constrained optimization, in place of the classical saddle-point method. VQ-VR alternates between stochastic variance-reduced gradient descent steps and virtual-queue updates. We further describe and analyze two instantiations of this framework, VQ-SVRG and VQ-SAGA. Our analysis for the convex case relies on the drift of a novel quadratic Lyapunov function. We prove that VQ-SVRG and VQ-SAGA both achieve a sublinear convergence rate of O(1/K) in expected cost suboptimality and constraint violation for smooth, general convex problems, where K is the number of iterations. To the best of our knowledge, VQ-VR is the first stochastic algorithm capable of solving decentralized nonlinear constrained optimization problems at an O(1/K) rate. We also present numerical results on two applications, decentralized QCQP and decentralized logistic regression. These results verify the theory and demonstrate that, on a per-gradient-evaluation basis, our algorithms improve the relative cost gap by more than 7 dB over existing methods.
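The variance-reduced gradient step at the core of VQ-SVRG can be sketched as follows. This is a minimal illustration of the standard SVRG estimator on a least-squares finite sum, not the paper's algorithm: the virtual-queue updates and pairwise constraints of VQ-VR are omitted, and all problem data and parameters below are illustrative.

```python
import numpy as np

# Finite-sum problem: min_x (1/n) * sum_i f_i(x),
# with f_i(x) = 0.5 * (a_i @ x - b_i)^2 (least squares).
rng = np.random.default_rng(0)
n, d = 200, 5
A = rng.normal(size=(n, d))
x_star = rng.normal(size=d)
b = A @ x_star  # consistent system, so the minimizer is x_star

def grad_i(x, i):
    # gradient of a single constituent function f_i
    return (A[i] @ x - b[i]) * A[i]

def full_grad(x):
    # full (batch) gradient, computed once per epoch at the snapshot
    return A.T @ (A @ x - b) / n

x = np.zeros(d)
step = 0.02
for epoch in range(50):
    snapshot = x.copy()
    mu = full_grad(snapshot)
    for _ in range(n):
        i = rng.integers(n)
        # SVRG estimator: unbiased, and its variance vanishes
        # as both x and the snapshot approach the optimum
        g = grad_i(x, i) - grad_i(snapshot, i) + mu
        x -= step * g

print(np.linalg.norm(x - x_star))  # distance to the optimum shrinks across epochs
```

SAGA replaces the periodic snapshot with a per-sample table of the most recent constituent gradients, trading memory for the elimination of full-gradient passes; VQ-SAGA inherits that trade-off within the VQ-VR framework.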
Pages: 1960-1973 (14 pages)