l0 Sparsifying Transform Learning With Efficient Optimal Updates and Convergence Guarantees

Cited by: 72
Authors
Ravishankar, Saiprasad [1 ,2 ]
Bresler, Yoram [1 ,2 ]
Affiliations
[1] Univ Illinois, Dept Elect & Comp Engn, Urbana, IL 61801 USA
[2] Univ Illinois, Coordinated Sci Lab, Urbana, IL 61801 USA
Funding
US National Science Foundation;
Keywords
Denoising; dictionary learning; fast algorithms; image representation; non-convex; sparse representation; transform model; K-SVD; SPARSE; ALGORITHM; SEPARATION;
DOI
10.1109/TSP.2015.2405503
Chinese Library Classification (CLC)
TM [Electrical Engineering]; TN [Electronics & Communication Technology];
Discipline Classification Code
0808; 0809;
Abstract
Many applications in signal processing benefit from the sparsity of signals in a certain transform domain or dictionary. Synthesis sparsifying dictionaries that are directly adapted to data have been popular in applications such as image denoising, inpainting, and medical image reconstruction. In this paper, we focus instead on the sparsifying transform model, and study the learning of well-conditioned square sparsifying transforms. The proposed algorithms alternate between an ℓ0 "norm"-based sparse coding step, and a non-convex transform update step. We derive the exact analytical solution for each of these steps. The proposed solution for the transform update step achieves the global minimum in that step, and also provides speedups over iterative solutions involving conjugate gradients. We establish that our alternating algorithms are globally convergent to the set of local minimizers of the non-convex transform learning problems. In practice, the algorithms are insensitive to initialization. We present results illustrating the promising performance and significant speed-ups of transform learning over synthesis K-SVD in image denoising.
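The alternating scheme the abstract describes can be sketched in NumPy. This is a minimal illustration, not the authors' code: it assumes the common square-transform objective min over (W, Z) of ||WX − Z||²_F + λ(||W||²_F − log|det W|) subject to a per-column sparsity limit s, with an exact hard-thresholding sparse coding step and a closed-form SVD-based transform update derived for that objective. All function names and the parameters λ, s are illustrative; consult the paper (DOI above) for the authoritative formulation.

```python
import numpy as np

def sparse_code(WX, s):
    # Exact l0 step: keep the s largest-magnitude entries in each column.
    Z = np.zeros_like(WX)
    idx = np.argsort(-np.abs(WX), axis=0)[:s]          # row indices of top-s per column
    cols = np.arange(WX.shape[1])
    Z[idx, cols] = WX[idx, cols]
    return Z

def transform_update(X, Z, lam):
    # Closed-form minimizer of ||W X - Z||_F^2 + lam*||W||_F^2 - lam*log|det W|
    # for fixed Z (assumed derivation): factor X X^T + lam*I = L L^T, take the
    # SVD  L^{-1} X Z^T = Q diag(sig) R^T,  then
    #   W = R diag(0.5*(sig + sqrt(sig^2 + 2*lam))) Q^T L^{-1}.
    n = X.shape[0]
    L = np.linalg.cholesky(X @ X.T + lam * np.eye(n))
    Linv = np.linalg.inv(L)
    Q, sig, Rt = np.linalg.svd(Linv @ X @ Z.T)
    gamma = 0.5 * (sig + np.sqrt(sig**2 + 2.0 * lam))  # > 0, so W is invertible
    return Rt.T @ np.diag(gamma) @ Q.T @ Linv

def learn_transform(X, s, lam, iters=50, seed=0):
    # Alternate exact sparse coding and exact transform update.
    rng = np.random.default_rng(seed)
    W = np.linalg.qr(rng.standard_normal((X.shape[0],) * 2))[0]  # orthonormal init
    Z = sparse_code(W @ X, s)
    for _ in range(iters):
        Z = sparse_code(W @ X, s)
        W = transform_update(X, Z, lam)
    return W, Z
```

Because both steps are exact minimizations of the same objective, each iteration monotonically decreases the cost, which is the mechanism behind the convergence guarantees the abstract claims.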
Pages: 2389-2404
Page count: 16