Accelerated Additive Schwarz Methods for Convex Optimization with Adaptive Restart

Cited by: 5
Authors
Park, Jongho [1 ]
Affiliation
[1] Korea Adv Inst Sci & Technol, Nat Sci Res Inst, Daejeon 34141, South Korea
Funding
National Research Foundation of Singapore;
Keywords
Additive Schwarz method; Acceleration; Adaptive restart; Convex optimization; DOMAIN DECOMPOSITION METHODS; OSHER-FATEMI MODEL; CONVERGENCE RATE; 1ST-ORDER METHODS; MINIMIZATION;
DOI
10.1007/s10915-021-01648-z
Chinese Library Classification (CLC)
O29 [Applied Mathematics];
Discipline Classification Code
070104;
Abstract
Based on the observation that additive Schwarz methods for general convex optimization can be interpreted as gradient methods, we propose an acceleration scheme for additive Schwarz methods. By adopting acceleration techniques developed for gradient methods, such as momentum and adaptive restarting, the convergence rate of additive Schwarz methods is greatly improved. The proposed acceleration scheme requires no a priori information on the smoothness or sharpness of the target energy functional, so it applies to a broad class of convex optimization problems. Numerical results for linear elliptic, nonlinear elliptic, nonsmooth, and nonsharp problems are provided to highlight the superiority and broad applicability of the proposed scheme.
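The abstract refers to momentum and adaptive restarting as acceleration techniques for gradient methods. As an illustrative sketch only (not the paper's additive Schwarz solver), the following code applies Nesterov-type momentum with the gradient-based adaptive restart of O'Donoghue and Candès to a model quadratic; the function name, step size, and test problem are assumptions chosen for demonstration.

```python
import numpy as np

def accelerated_gradient_restart(grad, x0, step, iters=500):
    """Nesterov-accelerated gradient descent with gradient-based
    adaptive restart: whenever the update direction opposes the
    gradient at the extrapolated point, momentum is reset."""
    x = y = np.asarray(x0, dtype=float)
    t = 1.0  # momentum parameter
    for _ in range(iters):
        g = grad(y)
        x_new = y - step * g  # gradient step at extrapolated point
        if np.dot(g, x_new - x) > 0:
            # Restart: the step points "uphill", so drop the momentum.
            t = 1.0
            y = x_new
        else:
            t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2
            y = x_new + ((t - 1) / t_new) * (x_new - x)  # extrapolation
            t = t_new
        x = x_new
    return x

# Usage: minimize an ill-conditioned quadratic f(x) = 0.5 x^T A x.
A = np.diag([1.0, 100.0])
xmin = accelerated_gradient_restart(lambda x: A @ x, [1.0, 1.0], step=1.0 / 100.0)
```

The restart test needs no knowledge of the smoothness or sharpness constants of the objective, which mirrors the parameter-free character the abstract claims for the proposed scheme.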
Pages: 20