FNSAM: Image super-resolution using a feedback network with self-attention mechanism

Cited by: 2
Authors
Huang, Yu [1 ]
Wang, Wenqian [1 ]
Li, Min [1 ]
Affiliation
[1] Nanjing Univ Sci & Technol, Sch Comp Sci & Engn, Nanjing, Jiangsu, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
MRI image; super-resolution; feedback network; self-attention mechanism; convolutional neural network;
DOI
10.3233/THC-236033
Chinese Library Classification
R19 [Health organization and services (health service administration)];
Abstract
BACKGROUND: High-resolution (HR) magnetic resonance imaging (MRI) provides rich pathological information that is of great significance in the diagnosis and treatment of brain lesions. However, obtaining HR brain MRI images comes at the cost of extended scan times and sophisticated, expensive instruments. OBJECTIVE: This study aims to reconstruct HR MRI images from low-resolution (LR) images by developing a deep learning based super-resolution (SR) method. METHODS: We propose a feedback network with self-attention mechanism (FNSAM) for SR reconstruction of brain MRI images. Specifically, a feedback network is built to correct shallow features by using a recurrent neural network (RNN), and the self-attention mechanism (SAM) is integrated into the feedback network to extract important information as the feedback signal, which enriches the image hierarchy. RESULTS: Experimental results show that the proposed FNSAM achieves more faithful SR reconstruction of brain MRI images, in terms of both peak signal to noise ratio (PSNR) and structural similarity index measure (SSIM), than several state-of-the-art methods. CONCLUSION: Our proposed method is suitable for SR reconstruction of MRI images.
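The abstract evaluates reconstructions with PSNR and relies on dot-product self-attention as the feedback signal. As a minimal sketch (not the authors' implementation), the snippet below computes PSNR between a reference and a degraded image and applies a toy softmax self-attention step over flattened spatial features in NumPy; all function names and shapes are illustrative assumptions.

```python
import numpy as np

def psnr(ref, test, max_val=255.0):
    """Peak signal-to-noise ratio between two images (higher is better)."""
    mse = np.mean((ref.astype(np.float64) - test.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * np.log10(max_val ** 2 / mse)

def self_attention(x):
    """Toy scaled dot-product self-attention.

    x: (N, C) matrix of N flattened spatial positions with C channels.
    Returns attention-weighted features of the same shape.
    """
    scores = x @ x.T / np.sqrt(x.shape[1])       # (N, N) pairwise similarity
    scores -= scores.max(axis=1, keepdims=True)  # numerical stability
    attn = np.exp(scores)
    attn /= attn.sum(axis=1, keepdims=True)      # row-wise softmax
    return attn @ x

rng = np.random.default_rng(0)
hr = rng.integers(0, 256, size=(8, 8)).astype(np.float64)
lr_up = hr + rng.normal(0, 5, size=(8, 8))       # simulate reconstruction error
print(f"PSNR: {psnr(hr, lr_up):.2f} dB")

feats = rng.normal(size=(16, 4))                 # 16 positions, 4 channels
out = self_attention(feats)
print("attention output shape:", out.shape)
```

In the paper's setting, such an attention map would re-weight shallow features before they are fed back through the recurrent connection; the sketch only shows the metric and the attention arithmetic, not the network itself.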
Pages: S383 - S395 (13 pages)