Wireless Network Optimization via Stochastic Sub-gradient Descent: Rate Analysis

Cited by: 0
Authors
Bedi, Amrit Singh [1 ]
Rajawat, Ketan [1 ]
Affiliations
[1] Indian Inst Technol, Dept Elect Engn, Kanpur 208016, Uttar Pradesh, India
Keywords
Resource allocation; Algorithms; Convergence
DOI
Not available
CLC Classification Number
TP3 [Computing Technology / Computer Technology]
Subject Classification Code
0812
Abstract
This paper considers a general stochastic resource allocation problem that arises widely in wireless networks, cognitive radio networks, smart-grid communications, and cross-layer design. The problem formulation involves expectations with respect to a collection of random variables with unknown distributions, representing exogenous quantities such as channel gain, user density, or spectrum occupancy. The problem is solved in the dual domain using a constant step-size stochastic dual subgradient descent (SDSD) method, which yields a primal resource allocation subproblem at each time instant. The goal here is to characterize the non-asymptotic behavior of such stochastic resource allocations in an almost sure sense. This paper establishes a convergence rate result for the SDSD algorithm that precisely characterizes the trade-off between the rate of convergence and the choice of the constant step size epsilon. Towards this end, a novel stochastic bound on the gap between the objective function and the optimum is developed. The asymptotic behavior of the stochastic term is characterized in an almost sure sense, thereby generalizing the existing results for stochastic subgradient methods. As an application, the power and user allocation problem in device-to-device networks is formulated and solved using the SDSD algorithm. Further intuition on the rate results is obtained from the verification of the regularity conditions and accompanying simulation results.
Pages: 6
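Note: the following is a minimal illustrative sketch (in Python) of the constant step-size stochastic dual subgradient descent (SDSD) scheme described in the abstract, applied to a toy average-power-constrained rate maximization problem. The utility, constraint, channel model, and water-filling primal solution below are assumptions chosen for illustration only; they are not the device-to-device power and user allocation formulation considered in the paper.

    import numpy as np

    # Toy problem (assumed for illustration):
    #     maximize  E[log(1 + h * p(h))]   subject to   E[p(h)] <= P_MAX,
    # where h is an exogenous channel gain with unknown distribution.
    # SDSD observes one realization h_t per slot, solves the per-slot primal
    # subproblem, and takes a projected dual step with constant step size EPS.

    RNG = np.random.default_rng(0)
    P_MAX = 1.0      # average power budget (assumed)
    EPS = 0.01       # constant dual step size epsilon
    T = 20_000       # number of time slots

    lam = 1.0        # dual variable (Lagrange multiplier), kept nonnegative
    rates, powers = [], []

    for t in range(T):
        h = RNG.exponential(1.0)                 # observed channel realization
        # Per-slot primal subproblem:  max_p  log(1 + h * p) - lam * p,
        # whose solution is the water-filling allocation below.
        p = max(0.0, 1.0 / lam - 1.0 / h) if lam > 0 else P_MAX
        # A stochastic subgradient of the dual function at lam is P_MAX - p;
        # descend and project onto the nonnegative orthant.
        lam = max(0.0, lam - EPS * (P_MAX - p))
        rates.append(np.log1p(h * p))
        powers.append(p)

    print(f"avg rate  = {np.mean(rates):.3f}")
    print(f"avg power = {np.mean(powers):.3f}  (budget {P_MAX})")

With a smaller EPS the long-run average power tracks the budget more tightly but the dual variable adapts more slowly, which is the step-size trade-off the paper quantifies.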