Convergence of projected subgradient method with sparse or low-rank constraints

Cited: 0
Authors
Xu, Hang [1 ]
Li, Song [2 ]
Lin, Junhong [3 ]
Affiliations
[1] Zhejiang Univ, Sch Phys, Hangzhou 310027, Peoples R China
[2] Zhejiang Univ, Sch Math Sci, Hangzhou 310027, Peoples R China
[3] Zhejiang Univ, Ctr Data Sci, Hangzhou 310027, Peoples R China
Funding
National Natural Science Foundation of China; China Postdoctoral Science Foundation;
Keywords
Sparse constraint; Low-rank constraint; Projected subgradient method; Mixed noises; Nonsmooth formulation; STABLE SIGNAL RECOVERY; MATRIX RECOVERY; OPTIMIZATION; ALGORITHM;
DOI
10.1007/s10444-024-10163-2
CLC Number
O29 [Applied Mathematics];
Subject Classification Code
070104;
Abstract
Many problems in data science can be treated as recovering structural signals from a set of linear measurements, sometimes perturbed by dense noise or sparse corruptions. In this paper, we develop a unified framework based on a nonsmooth formulation with a sparse or low-rank constraint to meet the challenge of mixed noise: bounded noise combined with sparse noise. We show that the nonsmooth formulations of these problems can be solved by projected subgradient methods at a rapid rate when initialized at any point. Consequently, nonsmooth loss functions (e.g., $\ell_1$-minimization programs) are naturally robust against sparse noise. Our framework simplifies and generalizes the existing analyses, covering compressed sensing, matrix sensing, quadratic sensing, and bilinear sensing. Motivated by recent work on the stochastic gradient method, we also give some preliminary experimental and theoretical results on the projected stochastic subgradient method.
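The abstract's approach can be illustrated on the simplest covered instance, compressed sensing: minimize the nonsmooth loss $\|Ax - y\|_1$ over $s$-sparse vectors by taking subgradient steps and projecting via hard thresholding. The sketch below is our own minimal illustration under a noiseless synthetic setup, not the paper's implementation; all function names, the Polyak-type step size, and the problem dimensions are assumptions for the example.

```python
import numpy as np

def hard_threshold(x, s):
    """Project onto the set of s-sparse vectors: keep the s largest-magnitude entries."""
    z = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-s:]
    z[idx] = x[idx]
    return z

def projected_subgradient(A, y, s, iters=500):
    """Minimize ||Ax - y||_1 over s-sparse x by projected subgradient steps.

    Uses a Polyak-type step size, which assumes the optimal loss is 0
    (i.e., a noiseless instance); with noise a decaying step would be used instead.
    """
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        r = A @ x - y
        f = np.abs(r).sum()          # current l1 loss
        if f < 1e-12:                # already (numerically) optimal
            break
        g = A.T @ np.sign(r)         # a subgradient of the l1 loss at x
        x = hard_threshold(x - (f / (g @ g)) * g, s)
    return x

# Synthetic noiseless instance: recover a 5-sparse signal from 60 measurements.
rng = np.random.default_rng(0)
m, n, s = 60, 100, 5
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
support = rng.choice(n, s, replace=False)
x_true[support] = rng.choice([-1.0, 1.0], s)
y = A @ x_true

x_hat = projected_subgradient(A, y, s)
print("recovery error:", np.linalg.norm(x_hat - x_true))
```

The same template extends to the other settings in the abstract by swapping the projection (hard thresholding becomes a truncated SVD for low-rank constraints) and the measurement operator.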
Pages: 45
Related Papers (50 total)
  • [31] Sparse and low-rank multivariate Hawkes processes
    Bacry, Emmanuel
    Bompaire, Martin
    Gaiffas, Stephane
    Muzy, Jean-Francois
    JOURNAL OF MACHINE LEARNING RESEARCH, 2020, 21
  • [32] Pansharpening Based on Low-Rank and Sparse Decomposition
    Rong, Kaixuan
    Jiao, Licheng
    Wang, Shuang
    Liu, Fang
    IEEE JOURNAL OF SELECTED TOPICS IN APPLIED EARTH OBSERVATIONS AND REMOTE SENSING, 2014, 7 (12) : 4793 - 4805
  • [33] Sparse subspace clustering with low-rank transformation
    Xu, Gang
    Yang, Mei
    Wu, Qiufeng
    NEURAL COMPUTING & APPLICATIONS, 2019, 31 (07): : 3141 - 3154
  • [34] Sparse and Low-Rank Covariance Matrix Estimation
    Zhou, S.-L.
    Xiu, N.-H.
    Luo, Z.-Y.
    Kong, L.-C.
    Journal of the Operations Research Society of China, 2015, 3 (02) : 231 - 250
  • [35] Multimodal sparse and low-rank subspace clustering
    Abavisani, Mahdi
    Patel, Vishal M.
    INFORMATION FUSION, 2018, 39 : 168 - 177
  • [36] Low-rank and sparse embedding for dimensionality reduction
    Han, Na
    Wu, Jigang
    Liang, Yingyi
    Fang, Xiaozhao
    Wong, Wai Keung
    Teng, Shaohua
    NEURAL NETWORKS, 2018, 108 : 202 - 216
  • [37] NONNEGATIVE LOW-RANK SPARSE COMPONENT ANALYSIS
    Cohen, Jeremy E.
    Gillis, Nicolas
    2019 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2019, : 8226 - 8230
  • [38] Parametric PDEs: sparse or low-rank approximations?
    Bachmayr, Markus
    Cohen, Albert
    Dahmen, Wolfgang
    IMA JOURNAL OF NUMERICAL ANALYSIS, 2018, 38 (04) : 1661 - 1708
  • [39] Improved sparse low-rank matrix estimation
    Parekh, Ankit
    Selesnick, Ivan W.
    SIGNAL PROCESSING, 2017, 139 : 62 - 69
  • [40] Boosted Sparse and Low-Rank Tensor Regression
    He, Lifang
    Chen, Kun
    Xu, Wanwan
    Zhou, Jiayu
    Wang, Fei
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 31 (NIPS 2018), 2018, 31