FADLSR: A Lightweight Super-Resolution Network Based on Feature Asymmetric Distillation

Cited by: 0
Authors
Xin Yang
Hengrui Li
Hanying Jian
Tao Li
Affiliations
[1] Nanjing University of Aeronautics and Astronautics, School of Automation Engineering
Keywords
Super-resolution; Lightweight network; Feature distillation; Asymmetric convolution; Residual network
DOI: unavailable
Abstract
Super-resolution (SR) technology based on deep learning has achieved excellent results. However, the large number of convolution layers and parameters incurs high computational and storage costs during training, which severely limits practical deployment. To address this problem, this paper proposes a lightweight feature asymmetric distillation SR network (FADLSR). FADLSR builds its feature extractor from stacked feature asymmetric distillation blocks (FADBs), which extract low-resolution image features hierarchically and integrate them into more representative features that improve SR quality. In addition, we design a new focus block and add it to the FADB to improve the quality of feature acquisition, and we introduce asymmetric convolution to strengthen the key features of the skeleton region. Detailed experiments show that FADLSR achieves excellent results in both objective metrics and subjective visual quality on the Set5, Set14, B100, Urban100, and Manga109 test sets. With a parameter count roughly comparable to current state-of-the-art models, FADLSR outperforms the comparison algorithms by 10-15% in model performance.
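The asymmetric convolution mentioned in the abstract typically follows the ACNet-style design: during training, a square kernel runs in parallel with 1x3 and 3x1 kernels (which emphasize the horizontal and vertical "skeleton" of the square kernel), and at inference the three branches collapse into a single fused kernel at no extra cost. The sketch below is an illustration of that general principle in NumPy, not the authors' exact FADB; the naive `conv2d` helper and all variable names are our own assumptions.

```python
import numpy as np

def conv2d(x, k):
    """Naive 'valid' 2D cross-correlation with stride 1."""
    kh, kw = k.shape
    H, W = x.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)
    return out

rng = np.random.default_rng(0)
x = rng.standard_normal((8, 8))
k_sq = rng.standard_normal((3, 3))  # square branch
k_h = rng.standard_normal((1, 3))   # horizontal (1x3) branch
k_v = rng.standard_normal((3, 1))   # vertical (3x1) branch

# Training-time view: three parallel branches, outputs summed.
# The 1x3/3x1 outputs are cropped so all branches share the 3x3 'valid' grid.
y_branches = (conv2d(x, k_sq)
              + conv2d(x, k_h)[1:-1, :]
              + conv2d(x, k_v)[:, 1:-1])

# Inference-time view: fold the asymmetric kernels into one 3x3 kernel.
k_fused = k_sq.copy()
k_fused[1, :] += k_h[0, :]  # 1x3 kernel adds to the centre row
k_fused[:, 1] += k_v[:, 0]  # 3x1 kernel adds to the centre column
y_fused = conv2d(x, k_fused)

# The two views are numerically identical.
assert np.allclose(y_branches, y_fused)
```

Because convolution is linear, summing branch outputs equals convolving with the sum of (centre-aligned) kernels, which is why the extra branches cost nothing at inference time.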
Pages: 2149-2168 (19 pages)
Related papers (50 in total)
  • [21] Lightweight Single Image Super-resolution with Dense Connection Distillation Network
    Li, Yanchun
    Cao, Jianglian
    Li, Zhetao
    Oh, Sangyoon
    Komuro, Nobuyoshi
    ACM TRANSACTIONS ON MULTIMEDIA COMPUTING COMMUNICATIONS AND APPLICATIONS, 2021, 17 (01)
  • [22] Dynamic feature distillation and pyramid split large kernel attention network for lightweight image super-resolution
    Liu, Bingzan
    Ning, Xin
    Ma, Shichao
    Yang, Yizhen
    MULTIMEDIA TOOLS AND APPLICATIONS, 2024, 83 (33) : 79963 - 79984
  • [23] Lightweight Image Super-Resolution with Information Multi-distillation Network
    Hui, Zheng
    Gao, Xinbo
    Yang, Yunchu
    Wang, Xiumei
    PROCEEDINGS OF THE 27TH ACM INTERNATIONAL CONFERENCE ON MULTIMEDIA (MM'19), 2019, : 2024 - 2032
  • [24] LSRN-AED: lightweight super-resolution network based on asymmetric encoder–decoder
    Huang, S.
    Li, W.
    Yang, Y.
    Wan, W.
    Lai, H.
    Soft Computing, 2024, 28 (13-14) : 8513 - 8525
  • [25] Edge-enhanced Feature Distillation Network for Efficient Super-Resolution
    Wang, Yan
    2022 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION WORKSHOPS, CVPRW 2022, 2022, : 776 - 784
  • [26] Feature distillation network for efficient super-resolution with vast receptive field
    Zhang, Yanfeng
    Tan, Wenan
    Mao, Wenyi
    SIGNAL IMAGE AND VIDEO PROCESSING, 2025, 19 (02)
  • [27] Lightweight Remote-Sensing Image Super-Resolution via Re-Parameterized Feature Distillation Network
    Zhang, Tianlin
    Bian, Chunjiang
    Zhang, Xiaoming
    Chen, Hongzhen
    Chen, Shi
    IEEE GEOSCIENCE AND REMOTE SENSING LETTERS, 2023, 20
  • [28] Lightweight multi-scale attention feature distillation network for super-resolution reconstruction of digital rock images
    Zhang, Yubo
    Bi, Junhao
    Xu, Lei
    Xiang, Haibin
    Kong, Haihua
    Han, Chao
    GEOENERGY SCIENCE AND ENGINEERING, 2025, 246
  • [29] PFFN: Progressive Feature Fusion Network for Lightweight Image Super-Resolution
    Zhang, Dongyang
    Li, Changyu
    Xie, Ning
    Wang, Guoqing
    Shao, Jie
    PROCEEDINGS OF THE 29TH ACM INTERNATIONAL CONFERENCE ON MULTIMEDIA, MM 2021, 2021, : 3682 - 3690
  • [30] CFGN: A Lightweight Context Feature Guided Network for Image Super-Resolution
    Dai, Tao
    Ya, Mengxi
    Li, Jinmin
    Zhang, Xinyi
    Xia, Shu-Tao
    Zhu, Zexuan
    IEEE TRANSACTIONS ON EMERGING TOPICS IN COMPUTATIONAL INTELLIGENCE, 2024, 8 (01): : 855 - 865