POTENTIAL FUNCTION-BASED FRAMEWORK FOR MINIMIZING GRADIENTS IN CONVEX AND MIN-MAX OPTIMIZATION

Cited by: 0
Authors
Diakonikolas J. [1 ]
Wang P. [1 ]
Affiliations
[1] Department of Computer Sciences, University of Wisconsin-Madison, Madison, WI 53706
Funding
U.S. National Science Foundation;
Keywords
convergence analysis; gradient minimization; potential function;
DOI
10.1137/21M1397246
Abstract
Making the gradients small is a fundamental optimization problem that has eluded the unifying and simple convergence arguments which, in first-order optimization, have so far been primarily reserved for other convergence criteria, such as reducing the optimality gap. In particular, while many different potential function-based frameworks covering broad classes of algorithms exist for optimality gap-based convergence guarantees, we are not aware of such general frameworks addressing the gradient norm guarantees. To fill this gap, we introduce a novel potential function-based framework to study the convergence of standard methods for making the gradients small in smooth convex optimization and convex-concave min-max optimization. Our framework is intuitive and provides a lens for viewing algorithms that make the gradients small as being driven by a trade-off between reducing either the gradient norm or a certain notion of an optimality gap. On the lower bound side, we discuss the tightness of the obtained convergence results for the convex setup and provide a new lower bound for minimizing the norm of cocoercive operators, which allows us to argue about the optimality of methods in the min-max setup. © 2022 Society for Industrial and Applied Mathematics.
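As a rough illustration of the type of argument the abstract describes, consider a potential that mixes the optimality gap and the squared gradient norm; the coefficients a_k, b_k >= 0 below are illustrative assumptions and need not match the paper's actual potentials or coefficient schedules. For an L-smooth convex f with minimizer x^* and iterates x_0, x_1, ..., one may track

    \phi_k \;=\; a_k \,\bigl(f(x_k) - f(x^*)\bigr) \;+\; b_k \,\lVert \nabla f(x_k) \rVert^2 .

If each step of the method can be shown to satisfy \phi_{k+1} \le \phi_k, then \phi_N \le \phi_0; since convexity gives f(x_N) - f(x^*) \ge 0, the gap term can be dropped, yielding

    \lVert \nabla f(x_N) \rVert^2 \;\le\; \frac{\phi_0}{b_N} \;=\; \frac{a_0\,\bigl(f(x_0) - f(x^*)\bigr) + b_0\,\lVert \nabla f(x_0) \rVert^2}{b_N}.

In this schematic view, the faster the schedule b_N is allowed to grow while keeping the potential non-increasing, the stronger the resulting gradient norm guarantee; for example, b_N growing like N^2 would translate into an O(1/N^2) bound on the squared gradient norm.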
Pages: 1668-1697
Page count: 29