Convergence of projected subgradient method with sparse or low-rank constraints

Cited: 0
Authors
Xu, Hang [1 ]
Li, Song [2 ]
Lin, Junhong [3 ]
Affiliations
[1] Zhejiang Univ, Sch Phys, Hangzhou 310027, Peoples R China
[2] Zhejiang Univ, Sch Math Sci, Hangzhou 310027, Peoples R China
[3] Zhejiang Univ, Ctr Data Sci, Hangzhou 310027, Peoples R China
Funding
National Natural Science Foundation of China; China Postdoctoral Science Foundation;
Keywords
Sparse constraint; Low-rank constraint; Projected subgradient method; Mixed noises; Nonsmooth formulation; STABLE SIGNAL RECOVERY; MATRIX RECOVERY; OPTIMIZATION; ALGORITHM;
DOI
10.1007/s10444-024-10163-2
Chinese Library Classification
O29 [Applied Mathematics];
Discipline code
070104 ;
Abstract
Many problems in data science can be treated as recovering structured signals from a set of linear measurements, sometimes perturbed by dense noise or sparse corruptions. In this paper, we develop a unified framework that considers a nonsmooth formulation with a sparse or low-rank constraint to meet the challenge of mixed noises: bounded noise plus sparse corruptions. We show that the nonsmooth formulations of these problems can be solved well by projected subgradient methods at a rapid rate from any initialization. Consequently, nonsmooth loss functions ($\ell_1$-minimization programs) are naturally robust against sparse noise. Our framework simplifies and generalizes existing analyses, including compressed sensing, matrix sensing, quadratic sensing, and bilinear sensing. Motivated by recent work on the stochastic gradient method, we also give preliminary experimental and theoretical results on the projected stochastic subgradient method.
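For a concrete picture of the iteration the abstract describes, the sketch below applies a projected subgradient step for the nonsmooth ℓ1 loss ||Ax - b||_1, followed by a hard-thresholding projection onto s-sparse vectors. This is a minimal illustration under assumptions, not the authors' exact algorithm: the Polyak-type step size (which assumes the optimal loss is near zero), the problem sizes, and the random test data are all choices made here for demonstration.

```python
import numpy as np

def hard_threshold(x, s):
    """Project onto s-sparse vectors: keep the s largest-magnitude entries."""
    z = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-s:]
    z[idx] = x[idx]
    return z

def projected_subgradient_l1(A, b, s, steps=500):
    """Minimize the nonsmooth loss ||Ax - b||_1 over s-sparse x by
    subgradient steps plus hard-thresholding projection (illustrative sketch)."""
    n = A.shape[1]
    x = np.zeros(n)
    for _ in range(steps):
        r = A @ x - b
        g = A.T @ np.sign(r)                    # a subgradient of ||Ax - b||_1
        mu = np.abs(r).sum() / (g @ g + 1e-12)  # Polyak-type step, assumes optimum ~ 0
        x = hard_threshold(x - mu * g, s)
    return x

# Demo: a 3-sparse signal observed through random Gaussian measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100)) / np.sqrt(40)
x_true = np.zeros(100)
x_true[[5, 17, 60]] = [2.0, -1.5, 1.0]
b = A @ x_true
x_hat = projected_subgradient_l1(A, b, s=3)
```

The projection guarantees every iterate is s-sparse; the paper's point is that, under suitable conditions and step-size schedules, such iterates converge rapidly even when b is contaminated by bounded plus sparse noise, which this toy demo does not attempt to verify.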
Pages: 45