Bounds for the Tracking Error of First-Order Online Optimization Methods

Cited by: 7
Authors
Madden, Liam [1 ]
Becker, Stephen [1 ]
Dall'Anese, Emiliano [2 ]
Affiliations
[1] Univ Colorado Boulder, Dept Appl Math, Boulder, CO 80309 USA
[2] Univ Colorado Boulder, Dept Elect Comp & Energy Engn, Boulder, CO USA
Funding
U.S. National Science Foundation
Keywords
Smooth convex optimization; Online optimization; Convergence bound; Nesterov acceleration; Tikhonov regularization; GRADIENT METHODS; CONVERGENCE; ALGORITHMS;
DOI
10.1007/s10957-021-01836-9
Chinese Library Classification (CLC)
C93 [Management]; O22 [Operations Research];
Discipline Codes
070105; 12; 1201; 1202; 120202;
Abstract
This paper investigates online algorithms for smooth time-varying optimization problems, focusing first on methods with constant step-size, momentum, and extrapolation-length. Assuming strong convexity, precise results for the tracking iterate error (the limit supremum of the norm of the difference between the optimal solution and the iterates) of online gradient descent are derived. The paper then considers a general first-order framework, within which a universal lower bound on the tracking iterate error is established. Furthermore, a method using "long-steps" is proposed and shown to achieve the lower bound up to a fixed constant. This method is then compared with online gradient descent on specific examples. Finally, the paper analyzes the effect of regularization when the cost is not strongly convex; with regularization, it is possible to achieve a no-regret bound. The paper ends by testing the accelerated and regularized methods on synthetic time-varying least-squares and logistic regression problems, respectively.
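As an illustration of the setting the abstract describes, the sketch below (not the authors' code; the drift model, dimensions, and step-size choice are assumptions for demonstration) runs online gradient descent with a constant step-size on a synthetic time-varying least-squares problem and records the iterate error, which settles to a neighborhood of the moving optimum rather than decaying to zero — the limit supremum of this error is the tracking iterate error the paper bounds.

```python
import numpy as np

# Hypothetical example: online gradient descent tracking the minimizer of a
# time-varying least-squares cost  f_t(x) = 0.5 * ||A x - b_t||^2,
# where the target b_t drifts slowly over time.
rng = np.random.default_rng(0)
n = 5
A = np.eye(n) + 0.1 * rng.standard_normal((n, n))  # well-conditioned, so f_t is strongly convex
L = np.linalg.norm(A.T @ A, 2)   # smoothness constant (largest eigenvalue of A^T A)
eta = 1.0 / L                    # constant step-size

x = np.zeros(n)
errors = []
for t in range(200):
    b_t = np.sin(0.05 * t) * np.ones(n)   # slowly drifting data
    x_star = np.linalg.solve(A, b_t)      # exact time-t minimizer (for measuring error)
    x = x - eta * A.T @ (A @ x - b_t)     # one gradient step per time step
    errors.append(np.linalg.norm(x - x_star))

# The error plateaus at a drift-dependent level instead of vanishing.
print(max(errors[-50:]))
```

Because each time step allows only one gradient evaluation, the iterates chase the moving optimum and the residual error is governed by the drift rate and the condition number, which is exactly the quantity the paper's upper and lower bounds characterize.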
Pages: 437–457
Page count: 21
Source: Journal of Optimization Theory and Applications, 2021, 189: 437–457