METHODS FOR SPARSE AND LOW-RANK RECOVERY UNDER SIMPLEX CONSTRAINTS

Cited by: 5
Authors
Li, Ping [1 ]
Rangapuram, Syama Sundar [2 ,4 ]
Slawski, Martin [3 ]
Affiliations
[1] Baidu Res, Seattle, WA USA
[2] Amazon Res, Berlin, Germany
[3] George Mason Univ, 4400 Univ Dr, MS 4A7, Fairfax, VA 22030 USA
[4] Amazon Dev Ctr, Krausenstr 38, D-10117 Berlin, Germany
Keywords
DC programming; density matrices of quantum systems; estimation of mixture proportions; simplex constraints; sparsity; GENERALIZED LINEAR-MODELS; QUANTUM STATE TOMOGRAPHY; DANTZIG SELECTOR; LEAST-SQUARES; LASSO; CONSISTENCY; REGRESSION; MATRICES;
DOI
10.5705/ss.202016.0220
Chinese Library Classification (CLC)
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Subject classification codes
020208; 070103; 0714
Abstract
The de facto standard approach of promoting sparsity by means of ℓ1-regularization becomes ineffective in the presence of simplex constraints, that is, when the target is known to have non-negative entries summing to a given constant. The situation is analogous for the use of nuclear-norm regularization for the low-rank recovery of Hermitian positive semidefinite matrices with a given trace. In the present paper, we discuss several strategies to deal with this situation, from simple to more complex. First, we consider empirical risk minimization (ERM), whose theoretical properties with respect to prediction and ℓ2-estimation error are similar to those of ℓ1-regularization. In light of this, we argue that ERM followed by a sparsification step (e.g., thresholding) is a sound alternative to the common heuristic of dropping the sum constraint, applying ℓ1-regularization, and normalizing the result afterwards. Next, we show that no sparsity-promoting regularizer under simplex constraints can be convex. We therefore propose a novel regularization scheme based on the inverse or the negative of the squared ℓ2-norm, which avoids the shortcomings of various alternative methods from the literature. Our approach naturally extends to Hermitian positive semidefinite matrices with a given trace.
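The keywords mention DC programming, and the abstract describes a regularizer based on the negative (or inverse) of the squared ℓ2-norm under simplex constraints. The Python sketch below illustrates how such a scheme might be set up; it is not the authors' implementation, and the function names, the default value of lam, and the step-size choice are illustrative assumptions. The concave penalty is handled by a standard difference-of-convex (DCA) linearization, and each convex subproblem is solved by projected gradient descent onto the simplex.

import numpy as np

def project_simplex(v, s=1.0):
    # Euclidean projection onto {x : x >= 0, sum(x) = s} (Duchi et al., 2008).
    u = np.sort(v)[::-1]
    css = np.cumsum(u)
    idx = np.arange(1, v.size + 1)
    rho = np.nonzero(u * idx > (css - s))[0][-1]
    theta = (css[rho] - s) / (rho + 1.0)
    return np.maximum(v - theta, 0.0)

def neg_l2_simplex_regression(X, y, lam=0.1, s=1.0, n_outer=50, n_inner=200):
    # Minimize (1/2n)||y - Xb||^2 - (lam/2)||b||_2^2 over the simplex by a DC
    # scheme: linearize the concave penalty at the current iterate and solve
    # each convex subproblem by projected gradient descent.
    n, p = X.shape
    L = np.linalg.norm(X, 2) ** 2 / n   # Lipschitz constant of the data-fit gradient
    step = 1.0 / L
    b = np.full(p, s / p)               # start at the barycenter of the simplex
    for _ in range(n_outer):
        lin = lam * b                   # gradient of (lam/2)||.||_2^2 at the linearization point
        b_new = b.copy()
        for _ in range(n_inner):
            grad = X.T @ (X @ b_new - y) / n - lin
            b_new = project_simplex(b_new - step * grad, s)
        if np.linalg.norm(b_new - b) < 1e-8:
            b = b_new
            break
        b = b_new
    return b

# Toy usage: recover a 3-sparse vector of mixture proportions.
rng = np.random.default_rng(0)
n, p = 200, 50
b_true = np.zeros(p)
b_true[:3] = 1.0 / 3
X = rng.standard_normal((n, p))
y = X @ b_true + 0.05 * rng.standard_normal(n)
b_hat = neg_l2_simplex_regression(X, y, lam=0.1)
print(np.round(b_hat[:6], 3))

On such toy data the iterate typically concentrates its mass on the few active coordinates; a final thresholding step, as the abstract suggests for ERM, could then be applied to b_hat to enforce exact sparsity.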
Pages: 557-577
Number of pages: 21