Nonlinear residual minimization by iteratively reweighted least squares

Cited by: 0
Author: Juliane Sigl
Affiliation: [1] Technische Universität München
Keywords: Minimal norm residual of nonlinear equations; Iteratively reweighted least squares; Phase retrieval
DOI: not available
Abstract
In this paper we address the numerical solution of minimal norm residuals of nonlinear equations in finite dimensions. We take particular inspiration from the problem of finding a sparse vector solution of phase retrieval problems by means of greedy algorithms based on iterative residual minimizations in the $\ell_p$-norm, for $1 \le p \le 2$. Due to the mild smoothness of the problem, especially as $p \rightarrow 1$, we develop and analyze a generalized version of iteratively reweighted least squares (IRLS). This simple and efficient algorithm solves optimization problems involving non-quadratic, possibly non-convex and non-smooth cost functions by transforming them into a sequence of common least squares problems, which can in turn be tackled by more efficient numerical optimization methods. While its analysis has by now been developed in many different contexts (e.g., for sparse vector and low-rank matrix optimization, and for the solution of PDEs involving p-Laplacians) when the model equation is linear, no results have so far been provided for nonlinear ones. Here we address precisely the convergence and the rate of error decay of IRLS for such nonlinear problems. The convergence analysis is based on a reformulation of the algorithm as an alternating minimization of an energy functional whose main variables are the candidate solutions of the intermediate reweighted least squares problems and their weights. Under a specific coercivity condition, often verified in practice, and assumptions of local convexity, we show convergence of IRLS to minimizers of the nonlinear residual problem. For the case where local convexity is lacking, we propose an appropriate convexification by quadratic perturbations, and we show convergence of this modified procedure to at least a very good approximation of stationary points of the original problem. To illustrate the theoretical results we conclude the paper with several numerical experiments. We first compare IRLS with standard Matlab optimization functions on a simple and easily presentable example. We then numerically validate our theoretical results in the more involved framework of phase retrieval problems, which are our main motivation. Finally we examine the recovery capability of the algorithm on data corrupted by impulsive noise, where sparsification of the residual is desired.
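The IRLS scheme described above alternates between solving a weighted least squares problem and updating the weights from the current residual. The sketch below is not the paper's generalized nonlinear algorithm; it only illustrates the classical IRLS idea on the special case of a linear residual $F(x) = Ax - b$, with an assumed, simple heuristic for decreasing the smoothing parameter eps. The function name irls_lp_residual and all parameter choices are illustrative, not taken from the paper.

```python
import numpy as np

def irls_lp_residual(A, b, p=1.0, eps0=1.0, max_iter=100, tol=1e-10):
    """Approximately minimize ||A x - b||_p, 1 <= p <= 2, by IRLS.

    Each outer iteration solves a weighted least squares problem with
    weights derived from the current residual; a smoothing parameter
    eps > 0 keeps the weights finite as p -> 1.
    """
    m, _ = A.shape
    x = np.linalg.lstsq(A, b, rcond=None)[0]      # plain least squares start
    eps = eps0
    for _ in range(max_iter):
        r = A @ x - b
        # IRLS weights: w_i = (r_i^2 + eps^2)^{(p-2)/2}
        w = (r**2 + eps**2) ** ((p - 2.0) / 2.0)
        # weighted normal equations (A^T W A) x = A^T W b
        AtW = A.T * w                              # broadcasts w over columns of A^T
        x_new = np.linalg.solve(AtW @ A, AtW @ b)
        # assumed simple heuristic for shrinking the smoothing parameter
        eps = min(eps, np.linalg.norm(r, ord=np.inf) / m + 1e-12)
        if np.linalg.norm(x_new - x) <= tol * (1.0 + np.linalg.norm(x)):
            return x_new
        x = x_new
    return x

# toy usage: recover x from overdetermined data with a few large outliers
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 20))
x_true = rng.standard_normal(20)
b = A @ x_true
b[:5] += 50.0                                     # impulsive noise on a few entries
x_l1 = irls_lp_residual(A, b, p=1.0)
print(np.linalg.norm(x_l1 - x_true))              # expected to be small: the l1 fit downweights outliers
```

With p = 1 the reweighting drives the fit toward a sparse residual, which is the behavior exploited in the impulsive-noise experiments mentioned in the abstract.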
Published in: Computational Optimization and Applications, 2016, 64(3): 755-792
Page count: 37