SaReGAN: a salient regional generative adversarial network for visible and infrared image fusion

Cited by: 1
Authors
Gao, Mingliang [1 ]
Zhou, Yi'nan [2 ]
Zhai, Wenzhe [1 ]
Zeng, Shuai [3 ]
Li, Qilei [4 ]
Affiliations
[1] Shandong Univ Technol, Coll Elect & Elect Engn, Zibo 255000, Shandong, Peoples R China
[2] Genesis AI Lab, Futong Technol, Chengdu 610054, Peoples R China
[3] Sichuan Univ, West China Univ Hosp 2, Dept Obstet & Gynaecol, Chengdu, Sichuan, Peoples R China
[4] Queen Mary Univ London, Sch Elect Engn & Comp Sci, London E1 4NS, England
Keywords
Smart city; Image fusion; Visible and infrared image; Generative adversarial network; Salient region; Performance
DOI
10.1007/s11042-023-14393-2
Chinese Library Classification (CLC)
TP [automation and computer technology]
Discipline code
0812
Abstract
Multispectral image fusion plays a crucial role in smart-city environment safety. In the domain of visible and infrared image fusion, object vanishment after fusion is a key problem that restricts fusion performance. To address this problem, a novel Salient Regional Generative Adversarial Network (SaReGAN) is presented for infrared and visible image fusion. The SaReGAN consists of three parts. In the first part, the salient regions of the infrared image are extracted via a visual saliency map, and the information of these regions is preserved. In the second part, the visible image, the infrared image, and the salient information are merged thoroughly in the generator to obtain a pre-fused image. In the third part, the discriminator attempts to differentiate the pre-fused image from the visible image, so that the generator learns details from the visible image through the adversarial mechanism. Experimental results verify that the SaReGAN outperforms other state-of-the-art methods in both quantitative and qualitative evaluations.
Pages: 61659-61671
Page count: 13