The equivalence of half-quadratic minimization and the gradient linearization iteration

Cited by: 101
Authors
Nikolova, Mila [1]
Chan, Raymond H. [2]
Affiliations
[1] ENS Cachan, Ctr Math & Applicat, CNRS UMR 8536, F-94235 Cachan, France
[2] Chinese Univ Hong Kong, Dept Math, Shatin, Hong Kong, Peoples R China
Keywords
gradient linearization; half-quadratic (HQ) regularization; inverse problems; optimization; signal and image restoration; variational methods;
DOI
10.1109/TIP.2007.896622
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
A popular way to restore images comprising edges is to minimize a cost function combining a quadratic data-fidelity term and an edge-preserving (possibly nonconvex) regularization term. Mainly because of the latter term, the calculation of the solution is slow and cumbersome. Half-quadratic (HQ) minimization (multiplicative form) was pioneered by Geman and Reynolds (1992) in order to alleviate the computational task in the context of image reconstruction with nonconvex regularization. By promoting the idea of locally homogeneous image models with a continuous-valued line process, they reformulated the optimization problem in terms of an augmented cost function which is quadratic with respect to the image and separable with respect to the line process, hence the name "half quadratic." Since then, a large number of papers have been dedicated to HQ minimization, and important results, including edge preservation under convex regularization and convergence results, have been obtained. In this paper, we show that HQ minimization (multiplicative form) is equivalent to the simplest and most basic method, in which the gradient of the cost function is linearized at each iteration step. In fact, both methods give exactly the same iterations. Furthermore, connections of HQ minimization with other methods, such as the quasi-Newton method and the generalized Weiszfeld method, are straightforward.
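To make the stated equivalence concrete, here is a minimal sketch of the two iterations being compared, in notation chosen here for illustration (the record gives only the abstract above; the symbols A, y, G, g_i, φ, ψ, β are assumptions, not taken from the paper):

% illustrative notation only; not reproduced from the paper
\[
\mathcal{J}(x)=\|Ax-y\|_2^2+\beta\sum_i \varphi\bigl(g_i^{\top}x\bigr),
\qquad
\mathcal{K}(x,b)=\|Ax-y\|_2^2+\beta\sum_i\Bigl(b_i\,\bigl(g_i^{\top}x\bigr)^2+\psi(b_i)\Bigr),
\]
where \(\mathcal{K}\) is the augmented (half-quadratic) cost: quadratic in the image \(x\) for fixed \(b\), separable in the line process \(b\) for fixed \(x\). Multiplicative HQ alternates
\[
b_i^{(k)}=\frac{\varphi'\bigl(g_i^{\top}x^{(k)}\bigr)}{2\,g_i^{\top}x^{(k)}},
\qquad
\Bigl(A^{\top}A+\beta\,G^{\top}\mathrm{diag}\bigl(b^{(k)}\bigr)\,G\Bigr)\,x^{(k+1)}=A^{\top}y,
\]
with \(G\) the matrix whose rows are \(g_i^{\top}\) (assuming \(\varphi\) is differentiable and \(\varphi'(t)/t\) extends continuously at 0). Linearizing the gradient instead, i.e. freezing the diagonal weights in
\[
\nabla\mathcal{J}(x)=2A^{\top}(Ax-y)+\beta\,G^{\top}\mathrm{diag}\!\Bigl(\tfrac{\varphi'(g_i^{\top}x)}{g_i^{\top}x}\Bigr)\,Gx
\]
at \(x^{(k)}\) and solving the resulting linear equation for \(x^{(k+1)}\), leads to exactly the same linear system, which is the equivalence stated in the abstract.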
Pages: 1623-1627
Number of pages: 5