Low-light image enhancement for infrared and visible image fusion

Times Cited: 1
Authors
Zhou, Yiqiao [1 ]
Xie, Lisiqi [1 ]
He, Kangjian [1 ]
Xu, Dan [1 ,3 ]
Tao, Dapeng [1 ]
Lin, Xu [2 ]
Affiliations
[1] Yunnan Univ, Sch Informat Sci & Engn, Kunming, Peoples R China
[2] Yunnan Union Vis Innovat Technol Co Ltd, Kunming, Peoples R China
[3] Yunnan Univ, Sch Informat Sci & Engn, Kunming 650091, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
image denoising; image enhancement; image fusion; INFORMATION; NEST;
DOI
10.1049/ipr2.12857
CLC Number
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Infrared and visible image fusion (IVIF) is an essential branch of image fusion, and enhancing the visible image in IVIF can significantly improve fusion performance. However, many existing low-light enhancement methods are unsuitable for visible image enhancement in IVIF. To solve this problem, this paper proposes a new visible image enhancement method for IVIF. First, colour balance and contrast enhancement-based self-calibrated illumination estimation (CCSCE) is proposed to improve the brightness, contrast, and colour information of the input image. Then, a strategy based on mutually guided image filtering (muGIF) is designed to adaptively extract details from the original visible image, which effectively preserves details without introducing additional noise. Finally, the proposed visible image enhancement technique is applied to IVIF tasks. In addition, the proposed method can be used both for visible image enhancement in IVIF and for other low-light images. Experimental results on different public datasets and IVIF tasks demonstrate the superiority of the authors' method in both qualitative and quantitative comparisons. The authors' code will be publicly available at .
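As a rough illustration of the three-stage flow described in the abstract (illumination-based enhancement of the low-light visible image, detail extraction, and fusion with the infrared image), the Python sketch below uses generic stand-ins: a smoothed max-channel illumination map instead of CCSCE, a Gaussian base/detail split instead of muGIF, and a naive weighted fusion rule. All function names and parameters are hypothetical and are not taken from the paper.

```python
# Illustrative sketch of the three-stage pipeline described in the abstract.
# NOTE: the illumination estimate, detail extraction, and fusion rule below are
# generic stand-ins, NOT the authors' CCSCE or muGIF formulations.
import numpy as np
import cv2


def estimate_illumination(vis, sigma=15):
    """Coarse illumination map: smoothed per-pixel max over colour channels (Retinex-style stand-in)."""
    lum = vis.max(axis=2)                          # per-pixel max of R, G, B
    return cv2.GaussianBlur(lum, (0, 0), sigma)    # spatially smooth estimate


def enhance_visible(vis, eps=1e-3, gamma=0.6):
    """Brighten the visible image by dividing out a gamma-compressed illumination map."""
    illum = np.clip(estimate_illumination(vis), eps, 1.0)
    return np.clip(vis / (illum[..., None] ** gamma), 0.0, 1.0)


def extract_details(vis, sigma=3):
    """Detail layer = original minus its smoothed base (stand-in for muGIF-based extraction)."""
    base = cv2.GaussianBlur(vis, (0, 0), sigma)
    return vis - base


def fuse(vis_enhanced, details, ir, w_ir=0.4):
    """Naive weighted fusion of the enhanced visible image, its detail layer, and the infrared image."""
    ir3 = np.repeat(ir[..., None], 3, axis=2)      # broadcast single-channel IR to 3 channels
    fused = (1 - w_ir) * vis_enhanced + w_ir * ir3 + details
    return np.clip(fused, 0.0, 1.0)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    vis = rng.random((240, 320, 3)).astype(np.float32) * 0.2   # synthetic low-light visible image
    ir = rng.random((240, 320)).astype(np.float32)             # synthetic infrared image
    out = fuse(enhance_visible(vis), extract_details(vis), ir)
    print(out.shape, out.dtype)
```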
Pages: 3216-3234
Page count: 19
Related Articles (50 total)
  • [1] Chen, Jun; Yang, Liling; Liu, Wei; Tian, Xin; Ma, Jiayi. LENFusion: A Joint Low-Light Enhancement and Fusion Network for Nighttime Infrared and Visible Image Fusion. IEEE Transactions on Instrumentation and Measurement, 2024, 73: 1-15.
  • [2] Liu, Shuo; Piao, Yan; Tahir, Muhammad. Research on fusion technology based on low-light visible image and infrared image. Optical Engineering, 2016, 55(12).
  • [3] Teku, Sandhya Kumari; Rao, S. Koteswara; Prabha, I. Santhi. Contrast Enhanced Low-light Visible and Infrared Image Fusion. Defence Science Journal, 2016, 66(03): 266-271.
  • [4] Zhang, Xin; Wang, Xia; Yan, Changda; Sun, Qiyang. EV-Fusion: A Novel Infrared and Low-Light Color Visible Image Fusion Network Integrating Unsupervised Visible Image Enhancement. IEEE Sensors Journal, 2024, 24(04): 4920-4934.
  • [5] Liu, Yaochen; Dong, Lili; Xu, Wenhai. Infrared and visible image fusion via salient object extraction and low-light region enhancement. Infrared Physics & Technology, 2022, 124.
  • [6] Wu, Ruiqing; Yu, Dayan; Liu, Jian; Wu, Hao; Chen, Wei; Gu, Qingshui. An Improved Fusion Method for Infrared and Low-Light Level Visible Image. 2017 14th International Computer Conference on Wavelet Active Media Technology and Information Processing (ICCWAMTIP), 2017: 147-151.
  • [7] Gao, Xiang; Lv, Guohua; Dong, Aimei; Wei, Zhonghe; Cheng, Jinyong. L2Fusion: Low-Light Oriented Infrared and Visible Image Fusion. 2023 IEEE International Conference on Image Processing (ICIP), 2023: 2405-2409.
  • [8] Lei, Xinyu; Liu, Longjun; Jia, Puhang; Li, Haoteng; Zhang, Haonan. Low-Light Infrared and Visible Image Fusion with Imbalanced Thermal Radiation Distribution. IEEE Transactions on Instrumentation and Measurement, 2024, 73.
  • [9] Wang, Haodian; Wang, Yang; Cao, Yang; Zha, Zheng-Jun. Fusion-Based Low-Light Image Enhancement. Multimedia Modeling, MMM 2023, Part I, 2023, 13833: 121-133.
  • [10] Wang, Changcheng; Zang, Yongsheng; Zhou, Dongming; Nie, Rencan; Mei, Jiatian. An interactive deep model combined with Retinex for low-light visible and infrared image fusion. Neural Computing & Applications, 2023, 35(16): 11733-11751.