Stochastic Primal-Dual Hybrid Gradient Algorithm with Adaptive Step Sizes

Cited: 2
Authors
Chambolle, Antonin [1 ,2 ]
Delplancke, Claire [3 ]
Ehrhardt, Matthias J. [4 ]
Schonlieb, Carola-Bibiane [5 ]
Tang, Junqi [6 ]
Affiliations
[1] Univ Paris 09, CEREMADE, Pl Marechal De Lattre De Tassigny, F-75775 Paris, France
[2] INRIA Paris, MOKAPLAN, Paris, France
[3] EDF Lab Paris Saclay, Route Saclay, F-91300 Palaiseau, France
[4] Univ Bath, Dept Math Sci, Bath BA2 7AY, England
[5] Univ Cambridge, Dept Appl Math & Theoret Phys, Wilberforce Rd, Cambridge CB3 0WA, England
[6] Univ Birmingham, Sch Math, Birmingham B15 2TT, England
Funding
UK Engineering and Physical Sciences Research Council (EPSRC);
Keywords
CONVERGENCE;
DOI
10.1007/s10851-024-01174-1
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
In this work, we propose a new primal-dual algorithm with adaptive step sizes. The stochastic primal-dual hybrid gradient (SPDHG) algorithm with constant step sizes has been widely applied in large-scale convex optimization across many scientific fields due to its scalability. While the product of the primal and dual step sizes is subject to an upper bound in order to ensure convergence, the choice of the ratio of the step sizes is critical in applications. To date, there is no systematic and reliable way of selecting the primal and dual step sizes for SPDHG. In this work, we propose a general class of adaptive SPDHG (A-SPDHG) algorithms and prove their convergence under weak assumptions. We also propose concrete parameter-update strategies that satisfy the assumptions of our theory and thereby lead to convergent algorithms. Numerical examples on computed tomography demonstrate the effectiveness of the proposed schemes.
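To make the role of the step sizes concrete, the following is a minimal, illustrative sketch of a plain SPDHG iteration with constant step sizes on a toy least-squares problem (not the adaptive A-SPDHG rules from the paper). The problem data, the block splitting, and the step-size heuristic are assumptions made for illustration; the sketch only shows where the primal step tau, the dual steps sigma_i, and their ratio rho enter the update, rho being the quantity the paper's adaptive schemes aim to tune automatically.

```python
# Illustrative SPDHG sketch (constant step sizes), not the A-SPDHG rules of the paper.
import numpy as np

rng = np.random.default_rng(0)

# Toy problem: min_x  sum_i 0.5*||A_i x - b_i||^2 + 0.5*lam*||x||^2,
# treated as a saddle-point problem with one dual block y_i per data block.
n_blocks, d, m = 4, 50, 20
A = [rng.standard_normal((m, d)) / np.sqrt(m) for _ in range(n_blocks)]
b = [rng.standard_normal(m) for _ in range(n_blocks)]
lam = 0.1

norms = np.array([np.linalg.norm(Ai, 2) for Ai in A])  # block operator norms
p = np.full(n_blocks, 1.0 / n_blocks)                  # uniform sampling probabilities

# Constant step sizes: the product tau*sigma_i is capped (roughly
# tau*sigma_i*||A_i||^2 <= p_i for convergence), while the ratio rho is the
# free tuning knob that A-SPDHG adapts online; here it is fixed by hand.
gamma, rho = 0.99, 1.0
sigma = gamma * rho / norms
tau = gamma / (rho * n_blocks * norms.max())

x = np.zeros(d)
y = [np.zeros(m) for _ in range(n_blocks)]
z = np.zeros(d)                                        # running sum_i A_i^T y_i

for k in range(2000):
    i = rng.integers(n_blocks)
    # Dual prox for f_i*(y), where f_i(z) = 0.5*||z - b_i||^2
    y_new = (y[i] + sigma[i] * (A[i] @ x) - sigma[i] * b[i]) / (1.0 + sigma[i])
    delta = A[i].T @ (y_new - y[i])
    y[i] = y_new
    z = z + delta
    # Primal prox for g(x) = 0.5*lam*||x||^2, with the SPDHG extrapolation term
    x = (x - tau * (z + delta / p[i])) / (1.0 + tau * lam)

print("residual:", sum(0.5 * np.linalg.norm(Ai @ x - bi) ** 2 for Ai, bi in zip(A, b)))
```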
Pages: 294-313
Number of pages: 20