Performance of noisy three-step accelerated first-order optimization algorithms for strongly convex quadratic problems

Cited by: 0
Authors
Samuelson, Samantha [1 ]
Mohammadi, Hesameddin [1 ]
Jovanovic, Mihailo R. [1 ]
Affiliation
[1] Univ Southern Calif, Dept Elect & Comp Engn, Los Angeles, CA 90089 USA
Keywords
Convex optimization; gradient descent; heavy-ball method; Nesterov's accelerated algorithms; noisy gradients; performance tradeoffs;
DOI
10.1109/CDC49753.2023.10383581
Chinese Library Classification
TP [automation technology; computer technology];
Discipline Code
0812
Abstract
We study the class of first-order algorithms in which the optimization variable is updated using information from three previous iterations. While two-step momentum algorithms akin to heavy-ball and Nesterov's accelerated methods achieve the optimal convergence rate, it is an open question whether the three-step momentum method can offer advantages for problems in which exact gradients are not available. For strongly convex quadratic problems, we identify algorithmic parameters which achieve the optimal convergence rate and examine how additional momentum terms affect the tradeoff between acceleration and noise amplification. Our results suggest that, for parameters that optimize the convergence rate, introducing additional momentum terms does not reduce variance amplification relative to standard accelerated algorithms.
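The class of algorithms the abstract describes can be illustrated with a minimal sketch. The update below uses the current iterate and two past iterates (three iterations of history in total); the step size and momentum coefficients are illustrative placeholders, not the optimal tuning the paper derives, and the gradient here is exact (noise-free).

```python
import math

# Strongly convex quadratic f(x) = 0.5 * sum(q_i * x_i^2),
# with Hessian eigenvalues q in [mu, L] (here mu = 1, L = 10).
q = [1.0, 10.0]

def grad(x):
    # Exact gradient of f; the paper's noisy setting would add a
    # zero-mean perturbation to this quantity.
    return [qi * xi for qi, xi in zip(q, x)]

# Generic three-step momentum update (coefficients a, b1, b2 are
# illustrative assumptions, not the paper's optimal parameters):
#   x_{k+1} = x_k + b1*(x_k - x_{k-1}) + b2*(x_{k-1} - x_{k-2}) - a*grad(x_k)
a, b1, b2 = 0.1, 0.5, 0.05   # b2 = 0 recovers a two-step (heavy-ball-like) method
x2 = x1 = x = [1.0, 1.0]     # x_{k-2}, x_{k-1}, x_k
for _ in range(200):
    g = grad(x)
    x_next = [xi + b1 * (xi - pi) + b2 * (pi - ppi) - a * gi
              for xi, pi, ppi, gi in zip(x, x1, x2, g)]
    x2, x1, x = x1, x, x_next

print(math.dist(x, [0.0, 0.0]))  # distance to the minimizer x* = 0
```

With these placeholder coefficients all modes of the recursion are stable, so the iterates converge to the minimizer; the paper's analysis concerns how such extra momentum terms trade convergence rate against amplification of gradient noise.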
Pages
1300-1305 (6 pages)