Stochastic Subgradient Algorithms for Strongly Convex Optimization Over Distributed Networks

Cited by: 37
Authors
Sayin, Muhammed O. [1 ]
Vanli, N. Denizcan [2 ]
Kozat, Suleyman S. [3 ]
Basar, Tamer [1 ]
Affiliations
[1] Univ Illinois Urbana Champaign UIUC, Coordinated Sci Lab, Urbana, IL 61801 USA
[2] MIT, Lab Informat & Decis Syst, 77 Massachusetts Ave, Cambridge, MA 02139 USA
[3] Bilkent Univ, Dept Elect & Elect Engn, TR-06800 Ankara, Turkey
Keywords
Distributed processing; convex optimization; online learning; diffusion strategies; consensus strategies; ADAPTIVE NETWORKS; DIFFUSION ADAPTATION; STRATEGIES; PERFORMANCE; SQUARES; LMS;
DOI
10.1109/TNSE.2017.2713396
Chinese Library Classification (CLC)
T [Industrial Technology];
Discipline Classification Code
08;
Abstract
We study diffusion- and consensus-based optimization of a sum of unknown convex objective functions over distributed networks. The only access to these functions is through stochastic gradient oracles, each available at a single node, and a limited number of oracle calls is allowed at each node. In this framework, we introduce a convex optimization algorithm based on stochastic subgradient descent (SSD) updates. We use a carefully designed time-dependent weighted averaging of the SSD iterates, which yields a convergence rate of O(N√N/((1−σ)T)) after T gradient updates at each node on a network of N nodes, where 0 ≤ σ < 1 denotes the second largest singular value of the communication matrix. This rate of convergence matches the performance lower bound up to constant terms. As with plain SSD, the computational complexity of the proposed algorithm scales linearly with the dimensionality of the data, and its communication load is the same as that of the SSD algorithm, so the method is highly efficient in terms of both complexity and communication. We illustrate the merits of the algorithm against state-of-the-art methods on benchmark real-life data sets.
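To make the abstract's description concrete, the following is a minimal sketch (in Python/NumPy, not the authors' pseudocode) of distributed stochastic subgradient descent with a diffusion-style combine step and a time-dependent weighted average of the iterates. The names distributed_ssd, local_subgradient, and combine_matrix, as well as the specific O(1/t) step size and the weight-t averaging schedule, are illustrative assumptions rather than details taken from the paper.

```python
import numpy as np

def distributed_ssd(local_subgradient, combine_matrix, dim, T, mu=1.0):
    """Run T rounds of distributed SSD with diffusion combining.

    local_subgradient(i, w, t) returns a stochastic subgradient of node i's
    objective at w; combine_matrix is an N x N doubly stochastic matrix whose
    sparsity pattern matches the network links.
    """
    N = combine_matrix.shape[0]
    W = np.zeros((N, dim))        # current iterate at each node (one row per node)
    W_avg = np.zeros((N, dim))    # time-weighted average of iterates at each node
    weight_sum = 0.0
    for t in range(1, T + 1):
        # 1) Local stochastic subgradient step with an O(1/t) step size,
        #    as is standard for strongly convex objectives.
        G = np.stack([local_subgradient(i, W[i], t) for i in range(N)])
        W = W - (mu / t) * G
        # 2) Diffusion/consensus combine step: mix iterates with neighbors.
        W = combine_matrix @ W
        # 3) Time-dependent weighted averaging: iterate t gets weight t,
        #    so later (more accurate) iterates dominate the average.
        weight_sum += t
        W_avg += (t / weight_sum) * (W - W_avg)
    return W_avg

if __name__ == "__main__":
    # Toy usage: 8 nodes on a ring; node i holds f_i(w) = 0.5 * ||w - c_i||^2
    # observed through noisy gradients, so the network optimum is mean(c_i).
    rng = np.random.default_rng(0)
    N, dim = 8, 5
    centers = rng.normal(size=(N, dim))
    A = np.zeros((N, N))
    for i in range(N):
        for j in (i - 1, i, i + 1):      # ring links plus a self-loop
            A[i, j % N] = 1.0 / 3.0      # symmetric => doubly stochastic
    noisy_grad = lambda i, w, t: (w - centers[i]) + 0.1 * rng.normal(size=dim)
    W_hat = distributed_ssd(noisy_grad, A, dim, T=2000)
    print(np.linalg.norm(W_hat[0] - centers.mean(axis=0)))  # should be small
```

In this sketch the combine matrix must be doubly stochastic and respect the network links; its second largest singular value plays the role of σ in the convergence rate quoted above.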
Pages: 248-260
Number of pages: 13