Lightweight global-locally connected distillation network for single image super-resolution

Cited: 3
Authors
Zeng, Cong [1 ]
Li, Guangyao [1 ]
Chen, Qiaochuan [2 ]
Xiao, Qingguo [3 ]
Affiliations
[1] Tongji Univ, Coll Elect & Informat Engn, Shanghai 201804, Peoples R China
[2] Shanghai Univ, Sch Comp Engn & Sci, Shanghai 200444, Peoples R China
[3] Linyi Univ, Sch Automat & Elect Engn, Linyi 276000, Shandong, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Convolutional neural network; Image super-resolution; Global-local connection method; Information distillation; Wide activation; ACCURATE;
DOI
10.1007/s10489-022-03454-y
CLC number
TP18 [Theory of artificial intelligence];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
As convolutional neural networks (CNNs) have been widely applied to the ill-posed single image super-resolution (SISR) task, most previous CNN-based methods have made significant progress in terms of both peak signal-to-noise ratio (PSNR) and structural similarity (SSIM). However, as the layers in these networks grow ever deeper, they demand more and more computing power and fail to distill the feature maps. In this paper, we propose a lightweight global-locally connected distillation network, GLCDNet. Specifically, we propose a wide-activation shrink-expand convolutional block whose filter channels first shrink and then expand to aggregate more information. This information is concatenated with the feature maps of the preceding blocks to further exploit shallow information. Thus, the block exploits statistics across most feature channels while refining the useful information in the features. Furthermore, combined with the global-local connection method, our network is robust on benchmark datasets and runs at high processing speed. Comparative results demonstrate that our GLCDNet achieves superior performance while keeping parameter count and speed in balance.
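The abstract does not specify the block's exact layers, so the following is only a minimal NumPy sketch of the two ideas it describes: a channel projection that first shrinks and then expands (modeled here as 1×1 convolutions with ReLU, with hypothetical channel sizes 16 → 4 → 16), and the dense/global-local connection that concatenates a block's output with earlier shallow features along the channel axis. The function names and weight shapes are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def conv1x1(x, w):
    # A 1x1 convolution is a per-pixel linear map over channels.
    # x: (C_in, H, W), w: (C_out, C_in) -> result: (C_out, H, W)
    return np.einsum('oc,chw->ohw', w, x)

def shrink_expand_block(x, w_shrink, w_expand):
    # Shrink the channel dimension, apply the (wide) ReLU activation,
    # then expand back to the original channel count.
    h = np.maximum(conv1x1(x, w_shrink), 0.0)  # shrink + ReLU
    return conv1x1(h, w_expand)                # expand

rng = np.random.default_rng(0)
C, H, W = 16, 8, 8                        # hypothetical feature-map sizes
x = rng.standard_normal((C, H, W))
w_s = 0.1 * rng.standard_normal((4, C))   # shrink 16 -> 4 channels
w_e = 0.1 * rng.standard_normal((C, 4))   # expand 4 -> 16 channels

y = shrink_expand_block(x, w_s, w_e)      # (16, 8, 8)

# Global-local connection: concatenate the block output with the
# shallow input features along the channel axis for the next block.
fused = np.concatenate([x, y], axis=0)
print(fused.shape)  # (32, 8, 8)
```

In a real network the concatenated channels would typically be fused back down with another 1×1 convolution before the next block; this sketch only shows why the channel count doubles at each concatenation.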
Pages: 17797-17809 (13 pages)
Related papers
50 records in total
  • [31] Balanced Spatial Feature Distillation and Pyramid Attention Network for Lightweight Image Super-resolution
    Gendy, Garas
    Sabor, Nabil
    Hou, Jingchao
    He, Guanghui
    NEUROCOMPUTING, 2022, 509: 157 - 166
  • [32] DDistill-SR: Reparameterized Dynamic Distillation Network for Lightweight Image Super-Resolution
    Wang, Yan
    Su, Tongtong
    Li, Yusen
    Cao, Jiuwen
    Wang, Gang
    Liu, Xiaoguang
    IEEE TRANSACTIONS ON MULTIMEDIA, 2023, 25 : 7222 - 7234
  • [33] Lightweight Inverse Separable Residual Information Distillation Network for Image Super-Resolution Reconstruction
    Zhao X.
    Li X.
    Song Z.
    Moshi Shibie yu Rengong Zhineng/Pattern Recognition and Artificial Intelligence, 2023, 36 (05): 419 - 432
  • [35] Lightweight image super-resolution with group-convolutional feature enhanced distillation network
    Zhang, Wei
    Fan, Zhongqiang
    Song, Yan
    Wang, Yagang
    INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 2023, 14 (07) : 2467 - 2482
  • [36] An efficient and lightweight image super-resolution with feature network
    Zang, Yongsheng
    Zhou, Dongming
    Wang, Changcheng
    Nie, Rencan
    Guo, Yanbu
    OPTIK, 2022, 255
  • [37] Fast and Accurate Single Image Super-Resolution via Information Distillation Network
    Hui, Zheng
    Wang, Xiumei
    Gao, Xinbo
    2018 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2018, : 723 - 731
  • [38] Lightweight Image Super-Resolution with ConvNeXt Residual Network
    Zhang, Yong
    Bai, Haomou
    Bing, Yaxing
    Liang, Xiao
    NEURAL PROCESSING LETTERS, 2023, 55 (07) : 9545 - 9561
  • [39] Lightweight subpixel sampling network for image super-resolution
    Zeng, Hongfei
    Wu, Qiang
    Zhang, Jin
    Xia, Haojie
    VISUAL COMPUTER, 2024, 40 (05): : 3781 - 3793
  • [40] Lightweight bidirectional feedback network for image super-resolution
    Wang, Beibei
    Yan, Binyu
    Liu, Changjun
    Hwangbo, Ryul
    Jeon, Gwanggil
    Yang, Xiaomin
    COMPUTERS & ELECTRICAL ENGINEERING, 2022, 102