Nearly Optimal First-Order Methods for Convex Optimization under Gradient Norm Measure: an Adaptive Regularization Approach

Cited by: 0
Authors
Masaru Ito
Mituhiro Fukuda
Affiliations
[1] Nihon University, Department of Mathematics, College of Science and Technology
[2] Tokyo Institute of Technology, Department of Mathematical and Computing Science
Keywords
Smooth/composite convex optimization; Accelerated proximal gradient methods; Hölderian error bound; Adaptive methods
MSC codes: 90C25; 68Q25; 49M37
DOI
Not available
Abstract
In the development of first-order methods for smooth (resp., composite) convex optimization problems, in which smooth functions with Lipschitz continuous gradients are minimized, the gradient (resp., gradient mapping) norm is a fundamental optimality measure. Under this measure, a fixed-iteration algorithm with the optimal iteration complexity for the smooth case is known; however, determining the number of iterations required to attain a desired accuracy demands prior knowledge of the distance from the initial point to the optimal solution set. In this paper, we report an adaptive regularization approach that attains the nearly optimal iteration complexity without knowing this distance. To obtain even faster convergence adaptively, we then apply this approach to construct a first-order method that is adaptive to the Hölderian error bound condition (or, equivalently, the Łojasiewicz gradient property), which covers moderately wide classes of applications. The proposed method attains nearly optimal iteration complexity with respect to the gradient mapping norm.
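The mechanism the abstract describes, regularizing the objective and adapting the regularization strength rather than assuming the distance to the solution set is known, can be illustrated concretely. Below is a minimal Python sketch of a standard guess-and-halve form of adaptive regularization for driving the gradient norm below a target eps; it is not the authors' exact algorithm. The function names, the constant-momentum inner solver, and the halving schedule for the regularization parameter sigma are illustrative assumptions.

```python
import numpy as np

def accel_grad_strongly_convex(grad, x0, L, mu, tol, max_iter=10_000):
    # Standard Nesterov acceleration for an L-smooth, mu-strongly convex
    # function; stops once the gradient norm drops below tol.
    q = mu / L
    beta = (1 - np.sqrt(q)) / (1 + np.sqrt(q))  # constant momentum coefficient
    x = y = x0.copy()
    for _ in range(max_iter):
        x_new = y - grad(y) / L            # gradient step at the extrapolated point
        y = x_new + beta * (x_new - x)     # momentum extrapolation
        x = x_new
        if np.linalg.norm(grad(x)) <= tol:
            break
    return x

def adaptive_regularization(grad_f, x0, L, eps, sigma0=1.0):
    # Minimize f_sigma(x) = f(x) + (sigma/2)*||x - x0||^2, which is
    # sigma-strongly convex and (L + sigma)-smooth, then halve sigma
    # until the gradient norm of the *original* f meets the target eps.
    sigma, x = sigma0, x0.copy()
    while True:
        grad_reg = lambda z, s=sigma: grad_f(z) + s * (z - x0)
        x = accel_grad_strongly_convex(grad_reg, x, L + sigma, sigma, eps / 2)
        # ||grad f(x)|| <= ||grad f_sigma(x)|| + sigma*||x - x0||, so once
        # sigma is small enough the regularization bias no longer dominates.
        if np.linalg.norm(grad_f(x)) <= eps:
            return x
        sigma /= 2

# Example: a convex quadratic f(x) = 0.5*x^T A x - b^T x with grad f = A x - b.
rng = np.random.default_rng(0)
M = rng.standard_normal((20, 20))
A = M.T @ M + np.eye(20)
b = rng.standard_normal(20)
x = adaptive_regularization(lambda z: A @ z - b, np.zeros(20),
                            np.linalg.norm(A, 2), eps=1e-6)
print(np.linalg.norm(A @ x - b))  # gradient norm at the output, <= 1e-6
```

Roughly speaking, halving sigma geometrically means the adaptive scheme pays only a modest extra cost over running with the "right" sigma from the start, which is the intuition behind the "nearly optimal" qualifier in the abstract.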
Pages: 770–804
Number of pages: 34