On the convergence of a Block-Coordinate Incremental Gradient method

Cited by: 0
Authors
Laura Palagi
Ruggiero Seccia
Affiliations
[1] Sapienza University of Rome, Department of Computer, Control and Management Engineering A. Ruberti
Source
Soft Computing | 2021, Volume 25
Keywords
Incremental gradient; Block-coordinate decomposition; Online optimization
DOI
Not available
Abstract
In this paper, we study the convergence of a block-coordinate incremental gradient method. Under some specific assumptions on the objective function, we prove that the block-coordinate incremental gradient method can be seen as a gradient method with errors, and convergence can be established by showing that the error at each iteration satisfies some standard conditions. Thus, we prove convergence towards stationary points when the block incremental gradient method is coupled with a diminishing stepsize, and towards an ϵ-approximate solution when a stepsize bounded away from zero is employed.
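As a rough illustration of the scheme described in the abstract, the following is a minimal Python sketch of one incremental pass of a block-coordinate gradient update with a diminishing stepsize. It is not the paper's exact algorithm or assumptions: the cyclic order over component functions and variable blocks, the toy least-squares objective, and the names bcig_epoch and grad_fi are illustrative choices.

    import numpy as np

    def bcig_epoch(w, grad_fi, n_samples, blocks, alpha):
        # One incremental pass: loop over component functions f_i and, for each,
        # update one variable block at a time using only that component's gradient.
        for i in range(n_samples):
            for block in blocks:
                g = grad_fi(w, i)              # gradient of the single component f_i at the current w
                w[block] -= alpha * g[block]   # update only the coordinates in this block
        return w

    # Toy problem (illustrative): f_i(w) = 0.5 * (a_i @ w - b_i)**2, split into two variable blocks.
    rng = np.random.default_rng(0)
    A, b = rng.normal(size=(50, 10)), rng.normal(size=50)
    grad_fi = lambda w, i: A[i] * (A[i] @ w - b[i])
    w = np.zeros(10)
    blocks = [np.arange(0, 5), np.arange(5, 10)]
    for k in range(200):
        w = bcig_epoch(w, grad_fi, A.shape[0], blocks, alpha=0.02 / (k + 1))  # diminishing stepsize

With a stepsize bounded away from zero (e.g. a small constant alpha), the same loop would only be expected to reach an ϵ-approximate solution, as the abstract states.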
Pages: 12615-12626
Number of pages: 11
Related articles
50 in total
  • [1] On the convergence of a Block-Coordinate Incremental Gradient method
    Palagi, Laura; Seccia, Ruggiero
    Soft Computing, 2021, 25(19): 12615-12626
  • [2] Asynchronous Incremental Block-Coordinate Descent
    Aytekin, Arda; Feyzmahdavian, Hamid Reza; Johansson, Mikael
    2014 52nd Annual Allerton Conference on Communication, Control, and Computing (Allerton), 2014: 19-24
  • [3] Block-coordinate and incremental aggregated proximal gradient methods for nonsmooth nonconvex problems
    Latafat, Puya; Themelis, Andreas; Patrinos, Panagiotis
    Mathematical Programming, 2022, 193(1): 195-224
  • [4] Random Block-Coordinate Gradient Projection Algorithms
    Singh, Chandramani; Nedic, Angelia; Srikant, R.
    2014 IEEE 53rd Annual Conference on Decision and Control (CDC), 2014: 185-190
  • [5] Block-Coordinate Gradient Descent Method for Linearly Constrained Nonsmooth Separable Optimization
    Tseng, P.; Yun, S.
    Journal of Optimization Theory and Applications, 2009, 140(3): 513-535
  • [6] Linear Convergence of Stochastic Block-Coordinate Fixed Point Algorithms
    Combettes, Patrick L.; Pesquet, Jean-Christophe
    2018 26th European Signal Processing Conference (EUSIPCO), 2018: 742-746
  • [7] Convergence rate of block-coordinate maximization Burer–Monteiro method for solving large SDPs
    Erdogdu, Murat A.; Ozdaglar, Asuman; Parrilo, Pablo A.; Vanli, Nuri Denizcan
    Mathematical Programming, 2022, 195: 243-281
  • [8] Stochastic Block-Coordinate Gradient Projection Algorithms for Submodular Maximization
    Li, Zhigang; Zhang, Mingchuan; Zhu, Junlong; Zheng, Ruijuan; Zhang, Qikun; Wu, Qingtao
    Complexity, 2018