MTKDSR: Multi-Teacher Knowledge Distillation for Super Resolution Image Reconstruction

Cited by: 2
Authors
Yao, Gengqi [1]
Li, Zhan [1]
Bhanu, Bir [2]
Kang, Zhiqing [1]
Zhong, Ziyi [1]
Zhang, Qingfeng [1]
Affiliations
[1] Jinan Univ, Dept Comp Sci, Guangzhou, Peoples R China
[2] Univ Calif Riverside, Dept Elect & Comp Engn, Riverside, CA USA
Funding
National Natural Science Foundation of China
Keywords
CONVOLUTIONAL NETWORK; SUPERRESOLUTION;
DOI
10.1109/ICPR56361.2022.9956250
Chinese Library Classification
TP18 [Theory of Artificial Intelligence]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
In recent years, the performance of single image super-resolution (SISR) methods based on deep neural networks has improved significantly. However, large model sizes and high computational costs are common problems for most SR networks. Meanwhile, a trade-off exists between higher reconstruction fidelity and improved perceptual quality when solving the SISR problem. In this paper, we propose a multi-teacher knowledge distillation approach for SR tasks (MTKDSR) that can train a balanced, lightweight, and efficient student network using different types of teacher models, each proficient in either reconstruction fidelity or perceptual quality. In addition, to generate more realistic and learnable textures, we propose an edge-guided SR network, EdgeSRN, as the perceptual teacher used in the MTKDSR framework. In our experiments, EdgeSRN was superior to models based on adversarial learning in its ability to transfer knowledge effectively. Extensive experiments show that the student trained by MTKDSR exhibits superior perceptual quality compared to state-of-the-art lightweight SR networks, with a smaller model size and fewer computations. Our code is available at https://github.com/lizhangray/MTKDSR.
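The abstract describes a student that distills from two complementary teachers at once. As a rough illustration of that idea (a minimal sketch under our own assumptions, not the authors' implementation), the PyTorch snippet below combines a ground-truth reconstruction loss with distillation losses against a frozen fidelity-oriented teacher and a frozen perceptual teacher. The function name mtkd_step, the use of L1 losses, and the weights w_gt/w_fid/w_per are hypothetical placeholders, not the paper's exact configuration.

```python
# Illustrative multi-teacher distillation step for SISR (a minimal sketch,
# not the paper's exact MTKDSR training procedure; loss choices and weights
# are assumptions).
import torch
import torch.nn.functional as F

def mtkd_step(student, fidelity_teacher, perceptual_teacher,
              lr_img, hr_img, optimizer,
              w_gt=1.0, w_fid=0.5, w_per=0.5):
    """One optimization step: the student fits the ground-truth HR image
    while also mimicking a fidelity-oriented teacher (high PSNR) and a
    perceptual teacher (realistic textures, e.g. an EdgeSRN-like model)."""
    with torch.no_grad():                    # teachers stay frozen
        sr_fid = fidelity_teacher(lr_img)    # distortion-oriented output
        sr_per = perceptual_teacher(lr_img)  # perception-oriented output

    sr_student = student(lr_img)

    loss_gt = F.l1_loss(sr_student, hr_img)    # reconstruction fidelity
    loss_fid = F.l1_loss(sr_student, sr_fid)   # distill fidelity teacher
    loss_per = F.l1_loss(sr_student, sr_per)   # distill perceptual teacher
    loss = w_gt * loss_gt + w_fid * loss_fid + w_per * loss_per

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

In such a setup, the balance between w_fid and w_per is what lets the student trade reconstruction fidelity against perceptual quality, the trade-off the abstract highlights.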
Pages: 352 - 358
Number of pages: 7
Related Papers
50 records in total
  • [41] Cross-View Gait Recognition Method Based on Multi-Teacher Joint Knowledge Distillation
    Li, Ruoyu
    Yun, Lijun
    Zhang, Mingxuan
    Yang, Yanchen
    Cheng, Feiyan
    SENSORS, 2023, 23 (22)
  • [42] Faster, Lighter, Stronger: Image Rectangling Using Multi-Teacher Instance-Level Distillation
    Mei, Yuan
    Yang, Lichun
    Wang, Mengsi
    Gao, Yidan
    Wu, Kaijun
    IEEE TRANSACTIONS ON CONSUMER ELECTRONICS, 2024, 70 (03) : 5441 - 5452
  • [43] MTMS: Multi-teacher Multi-stage Knowledge Distillation for Reasoning-Based Machine Reading Comprehension
    Zhao, Zhuo
    Xie, Zhiwen
    Zhou, Guangyou
    Huang, Jimmy Xiangji
    PROCEEDINGS OF THE 47TH INTERNATIONAL ACM SIGIR CONFERENCE ON RESEARCH AND DEVELOPMENT IN INFORMATION RETRIEVAL, SIGIR 2024, 2024, : 1995 - 2005
  • [44] Multi-Teacher Distillation With Single Model for Neural Machine Translation
    Liang, Xiaobo
    Wu, Lijun
    Li, Juntao
    Qin, Tao
    Zhang, Min
    Liu, Tie-Yan
    IEEE-ACM TRANSACTIONS ON AUDIO SPEECH AND LANGUAGE PROCESSING, 2022, 30 : 992 - 1002
  • [45] Learning Lightweight Object Detectors via Multi-Teacher Progressive Distillation
    Cao, Shengcao
    Li, Mengtian
    Hays, James
    Ramanan, Deva
    Wang, Yu-Xiong
    Gui, Liang-Yan
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 202, 2023, 202
  • [46] Model Compression with Two-stage Multi-teacher Knowledge Distillation for Web Question Answering System
    Yang, Ze
    Shou, Linjun
    Gong, Ming
    Lin, Wutao
    Jiang, Daxin
    PROCEEDINGS OF THE 13TH INTERNATIONAL CONFERENCE ON WEB SEARCH AND DATA MINING (WSDM '20), 2020, : 690 - 698
  • [47] Data-Free Low-Bit Quantization via Dynamic Multi-teacher Knowledge Distillation
    Huang, Chong
    Lin, Shaohui
    Zhang, Yan
    Li, Ke
    Zhang, Baochang
    PATTERN RECOGNITION AND COMPUTER VISION, PRCV 2023, PT VIII, 2024, 14432 : 28 - 41
  • [48] Collaborative Multi-Teacher Knowledge Distillation for Learning Low Bit-width Deep Neural Networks
    Cuong Pham
    Tuan Hoang
    Thanh-Toan Do
    2023 IEEE/CVF WINTER CONFERENCE ON APPLICATIONS OF COMPUTER VISION (WACV), 2023, : 6424 - 6432
  • [49] Dissolved oxygen prediction in the Taiwan Strait with the attention-based multi-teacher knowledge distillation model
    Chen, Lei
    Lin, Ye
    Guo, Minquan
    Lu, Wenfang
    Li, Xueding
    Zhang, Zhenchang
    OCEAN & COASTAL MANAGEMENT, 2025, 265
  • [50] Learning Semantic Textual Similarity via Multi-Teacher Knowledge Distillation: A Multiple Data Augmentation method
    Lu, Zhikun
    Zhao, Ying
    Li, Jinnan
    Tian, Yuan
    2024 9TH INTERNATIONAL CONFERENCE ON COMPUTER AND COMMUNICATION SYSTEMS, ICCCS 2024, 2024, : 1197 - 1203