An improved anchor neighborhood regression SR method based on low-rank constraint

Cited by: 0
Authors
Xin Yang
Li Liu
Chen Zhu
Yingqing Guo
Dake Zhou
Affiliation
[1] Nanjing University of Aeronautics and Astronautics, College of Automation Engineering
Source
The Visual Computer | 2022, Vol. 38
Keywords
Super-resolution; Sparse representation; Low-rank constraint; Anchor neighborhood regression;
DOI
Not available
Chinese Library Classification Number
Subject Classification Code
Abstract
Sparse-representation-based image super-resolution (SR) methods currently struggle to achieve high reconstruction speed and high reconstruction quality at the same time. This paper therefore proposes an improved anchor neighborhood regression SR algorithm based on a low-rank constraint. First, given the critical role of locality in nonlinear data learning, a locally weighted regularization weight is introduced into the computation of the projection matrix, constraining the projection according to the correlation between the anchor point and the atoms in its neighborhood. Then, in the reconstruction phase, a low-rank assumption on similar patches is used to further constrain the reconstructed patches and improve reconstruction quality. Experiments show that the proposed method not only recovers more image detail but also reconstructs faster, and it outperforms several state-of-the-art sparse representation methods under objective evaluation criteria.
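The abstract describes two ingredients: a locally weighted regularizer in the computation of each anchor's projection matrix, and a low-rank constraint on groups of similar patches during reconstruction. The NumPy sketch below illustrates both ideas under stated assumptions; the function names, the cosine-correlation weighting, the singular-value soft-thresholding step, and the parameters lambda_reg and tau are illustrative choices, not the authors' exact formulation.

```python
# Minimal sketch of locally weighted anchored projection and a low-rank
# patch-group refinement. All names and parameter values are assumptions
# for illustration; they are not taken from the paper.
import numpy as np

def compute_projection(N_lr, N_hr, anchor, lambda_reg=0.1):
    """Locally weighted ridge projection for one anchor atom.

    N_lr : (d_lr, n) LR neighborhood atoms; N_hr : (d_hr, n) HR counterparts;
    anchor : (d_lr,) the anchor atom itself.
    A diagonal weight matrix penalizes neighborhood atoms that correlate
    weakly with the anchor (the 'locally weighted regularization' idea).
    """
    # cosine correlation of each neighborhood atom with the anchor
    sim = N_lr.T @ anchor / (np.linalg.norm(N_lr, axis=0) * np.linalg.norm(anchor) + 1e-12)
    W = np.diag(1.0 - np.clip(sim, 0.0, 1.0))   # low similarity -> large penalty
    # ridge solution:  P = N_hr (N_lr^T N_lr + lambda * W)^(-1) N_lr^T
    G = N_lr.T @ N_lr + lambda_reg * W
    return N_hr @ np.linalg.solve(G, N_lr.T)

def low_rank_refine(patch_group, tau=0.05):
    """Soft-threshold the singular values of a group of similar patches
    (stacked as columns), enforcing the low-rank assumption on the group."""
    U, s, Vt = np.linalg.svd(patch_group, full_matrices=False)
    s = np.maximum(s - tau, 0.0)
    return (U * s) @ Vt
```

In use, each LR patch feature y would be mapped through the projection matrix of its nearest anchor (x = P @ y), and groups of similar reconstructed patches would be stacked column-wise and passed through low_rank_refine before being merged back into the output image.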
Pages: 405-418
Page count: 13
Related papers
50 items in total
  • [41] Multivariate response regression with low-rank and generalized sparsity
    Cho, Youngjin
    Park, Seyoung
    JOURNAL OF THE KOREAN STATISTICAL SOCIETY, 2022, 51 (03) : 847 - 867
  • [42] Augmented low-rank methods for gaussian process regression
    Thomas, Emil
    Sarin, Vivek
    APPLIED INTELLIGENCE, 2022, 52 (02) : 1254 - 1267
  • [43] FAST AND PRIVACY PRESERVING DISTRIBUTED LOW-RANK REGRESSION
    Wai, Hoi-To
    Scaglione, Anna
    Lafond, Jean
    Moulines, Eric
    2017 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2017, : 4451 - 4455
  • [44] Partial Trace Regression and Low-Rank Kraus Decomposition
    Kadri, Hachem
    Ayache, Stephane
    Huusari, Riikka
    Rakotomamonjy, Alain
    Ralaivola, Liva
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 119, 2020, 119
  • [45] Low-Rank Approximation and Regression in Input Sparsity Time
    Clarkson, Kenneth L.
    Woodruff, David P.
    JOURNAL OF THE ACM, 2017, 63 (06)
  • [46] Low-rank discriminative regression learning for image classification
    Lu, Yuwu
    Lai, Zhihui
    Wong, Wai Keung
    Li, Xuelong
    NEURAL NETWORKS, 2020, 125 : 245 - 257
  • [47] Multivariate response regression with low-rank and generalized sparsity
    Youngjin Cho
    Seyoung Park
    Journal of the Korean Statistical Society, 2022, 51 : 847 - 867
  • [48] Low-rank tensor regression for selection of grouped variables
    Chen, Yang
    Luo, Ziyan
    Kong, Lingchen
    JOURNAL OF MULTIVARIATE ANALYSIS, 2024, 203
  • [49] Near Optimal Sketching of Low-Rank Tensor Regression
    Haupt, Jarvis
    Li, Xingguo
    Woodruff, David P.
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 30 (NIPS 2017), 2017, 30
  • [50] Greedy low-rank algorithm for spatial connectome regression
    Kuerschner, Patrick
    Dolgov, Sergey
    Harris, Kameron Decker
    Benner, Peter
    JOURNAL OF MATHEMATICAL NEUROSCIENCE, 2019, 9 (01)