A Projected Extrapolated Gradient Method with Larger Step Size for Monotone Variational Inequalities

Cited by: 0
Authors
Xiaokai Chang
Jianchao Bai
Affiliations
[1] Lanzhou University of Technology, School of Science
[2] Northwestern Polytechnical University, School of Mathematics and Statistics
Keywords
Variational inequality; Projected gradient method; Convex optimization; Predict–correct step size; 47J20; 65C10; 65C15; 90C33
DOI: not available
Abstract
A projected extrapolated gradient method is designed for solving monotone variational inequalities in Hilbert spaces. Requiring only local Lipschitz continuity of the operator, the proposed method improves the value of the extrapolation parameter and admits larger step sizes, which are predicted from local information about the involved operator and corrected by bounding the distance between each pair of successive iterates. The correction is carried out only when this distance exceeds a given constant, and its main cost is one projection onto the feasible set. In particular, when the operator is the gradient of a convex function, the correction step is unnecessary. We establish convergence and an ergodic convergence rate for this larger range of parameters. Numerical experiments illustrate the gains in efficiency from the larger step sizes.
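
To make the iteration described in the abstract concrete, below is a minimal Python sketch of a projected extrapolated gradient scheme with a locally predicted step size, applied to a toy affine monotone variational inequality. Everything in it is an assumption for illustration: the extrapolation parameter `alpha`, the step-size growth and safety factors, the distance-based correction, and the stopping rule are generic placeholders, not the predict–correct rule analyzed in the paper.

```python
import numpy as np

def extrapolated_gradient_vi(F, proj, x0, alpha=0.4, lam0=0.1,
                             dist_bound=10.0, max_iter=5000, tol=1e-8):
    """Sketch of a projected extrapolated gradient scheme for VI(F, C):
    find x* in C with <F(x*), x - x*> >= 0 for all x in C.

        x_bar   = x_k + alpha * (x_k - x_{k-1})    # extrapolation
        x_{k+1} = proj(x_k - lam_k * F(x_bar))     # projected step

    lam_k is *predicted* from a local Lipschitz estimate of F and
    *corrected* via the distance bound below; both rules are
    illustrative placeholders, not the paper's exact predict-correct rule.
    """
    x_prev, x = x0.copy(), x0.copy()
    xb_prev, Fb_prev = x0.copy(), F(x0)
    lam = lam0
    for k in range(1, max_iter + 1):
        x_bar = x + alpha * (x - x_prev)
        F_bar = F(x_bar)
        # Predict: step size from an inverse local Lipschitz estimate of F
        # along the last two extrapolated points; allow mild growth.
        dx = np.linalg.norm(x_bar - xb_prev)
        dF = np.linalg.norm(F_bar - Fb_prev)
        if dx > 0 and dF > 0:
            lam = min(1.2 * lam, 0.3 * dx / dF)  # 0.3: conservative safety factor
        x_next = proj(x - lam * F_bar)
        # Correct (a guess at the flavor of the paper's correction step):
        # if successive iterates are farther apart than dist_bound, pull the
        # trial point back toward x and re-project onto the feasible set,
        # so the extra cost is one projection onto C.
        d = np.linalg.norm(x_next - x)
        if d > dist_bound:
            x_next = proj(x + (dist_bound / d) * (x_next - x))
        if np.linalg.norm(x_next - x) <= tol * max(1.0, np.linalg.norm(x)):
            return x_next, k
        x_prev, x = x, x_next
        xb_prev, Fb_prev = x_bar, F_bar
    return x, max_iter

# Toy monotone VI: F(x) = A x + b with A = skew-symmetric part + identity
# (hence strongly monotone and Lipschitz), feasible set C = [-1, 1]^n.
rng = np.random.default_rng(0)
n = 50
M = rng.standard_normal((n, n))
A = 0.5 * (M - M.T) / np.sqrt(n) + np.eye(n)
b = rng.standard_normal(n)
x_sol, iters = extrapolated_gradient_vi(
    lambda x: A @ x + b,                    # monotone affine operator
    lambda x: np.clip(x, -1.0, 1.0),        # projection onto the box
    np.zeros(n),
)
print(f"stopped after {iters} iterations")
```

The predicted step `0.3 * dx / dF` mimics estimating the inverse of a local Lipschitz constant, which is what permits steps larger than a fixed `1/L` wherever the operator is flat near the current iterates; the actual admissible parameter ranges and correction rule should be taken from the article itself.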
Pages: 602-627
Number of pages: 25