Deep learning network for fusing optical and infrared images in a complex imaging environment by using the modified U-Net

Cited: 0
Authors
Xiang, Bing-Quan [1 ]
Pan, Chao [2 ,3 ]
Liu, Jin [1 ]
Affiliations
[1] Wuhan Univ Sci & Technol, Sch Informat Sci & Engn, Wuhan 430081, Peoples R China
[2] Hubei Univ Econ, Sch Informat Engn, Wuhan 430205, Peoples R China
[3] Hubei Univ Econ, Hubei Internet Finance Informat Engn Technol Res C, Wuhan 430205, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
CNN;
DOI
10.1364/JOSAA.492002
CLC classification
O43 [Optics]
Discipline codes
070207; 0803
Abstract
The fusion of optical and infrared images is a critical task in the field of image processing. However, it is challenging to achieve optimal results when fusing images captured in complex environments. In this paper, we propose a deep learning network model, comprising an encoding network and a decoding network, based on a modified U-Net to fuse low-quality images from complex imaging environments. Because the encoding and decoding networks use similar convolutional modules, they can share similar layer structures to improve the overall fusion performance. Furthermore, an attention mechanism module is integrated into the decoding network to identify and capture the crucial features of the fused images. It helps the network extract more relevant image features and thus achieve a more accurate fusion. The proposed model is compared with several existing methods to demonstrate its performance under both subjective and objective evaluations. (c) 2023 Optica Publishing Group
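The record does not include the authors' code; as a rough illustration of the fusion scheme the abstract describes (features extracted from each modality, then combined using attention-derived weights), here is a minimal NumPy sketch. The single-kernel "encoder", the mean-activation attention score, and all function names are illustrative assumptions, not the paper's actual network.

```python
import numpy as np

def encode(img, kernel):
    # Toy "encoder": one 3x3 valid convolution standing in for the
    # convolutional modules shared by the encoding/decoding networks.
    h, w = img.shape
    out = np.zeros((h - 2, w - 2))
    for i in range(h - 2):
        for j in range(w - 2):
            out[i, j] = np.sum(img[i:i + 3, j:j + 3] * kernel)
    return out

def attention_weights(feats):
    # Softmax over per-source mean activations: a crude analogue of an
    # attention module scoring which modality's features matter more.
    scores = np.array([f.mean() for f in feats])
    e = np.exp(scores - scores.max())
    return e / e.sum()

def fuse(optical, infrared, kernel):
    f_opt = encode(optical, kernel)
    f_ir = encode(infrared, kernel)
    w = attention_weights([f_opt, f_ir])
    # The "decoder" here is simply the attention-weighted sum of features.
    return w[0] * f_opt + w[1] * f_ir

rng = np.random.default_rng(0)
opt = rng.random((8, 8))   # stand-in optical image
ir = rng.random((8, 8))    # stand-in infrared image
k = np.ones((3, 3)) / 9.0  # averaging kernel
fused = fuse(opt, ir, k)
print(fused.shape)  # (6, 6)
```

In the actual model the weights of both branches are learned end to end and the attention module operates on multi-channel feature maps; this sketch only shows the data flow of encode, weight, and combine.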
Pages: 1644-1653
Page count: 10
Related papers (50 records)
  • [21] Deep Learning Based Channel Estimation for UAVs: A Modified U-Net Approach
    Gupta, Chirag
    Yadav, Satyendra Singh
    ADVANCES IN ELECTRICAL AND COMPUTER ENGINEERING, 2025, 25 (01) : 61 - 70
  • [23] Lymph node detection in CT scans using modified U-Net with residual learning and 3D deep network
    Manjunatha, Yashwanth
    Sharma, Vanshali
    Iwahori, Yuji
    Bhuyan, M. K.
    Wang, Aili
    Ouchi, Akira
    Shimizu, Yasuhiro
    INTERNATIONAL JOURNAL OF COMPUTER ASSISTED RADIOLOGY AND SURGERY, 2023, 18 (04) : 723 - 732
  • [24] An encoder-decoder and modified U-Net network for microwave imaging of stroke
    Liu, Jinzhen
    Chen, Liming
    Xiong, Hui
    INTERNATIONAL JOURNAL OF IMAGING SYSTEMS AND TECHNOLOGY, 2024, 34 (02)
  • [25] Segmentation of Nano-Particles from SEM Images Using Transfer Learning and Modified U-Net
    Sanan, V. Sowmya
    Isaac, R. S. Rimal
    INTERNATIONAL JOURNAL OF ADVANCED COMPUTER SCIENCE AND APPLICATIONS, 2025, 16 (01) : 662 - 677
  • [26] Segmentation of Activated Sludge Phase Contrast Microscopy Images Using U-Net Deep Learning Model
    Zhao, Li-Jie
    Zou, Shi-Da
    Zhang, Yu-Hong
    Huang, Ming-Zhong
    Zuo, Yue
    Wang, Jia
    Lu, Xing-Kui
    Wu, Zhi-Hao
    Liu, Xiang-Yu
    SENSORS AND MATERIALS, 2019, 31 (06) : 2013 - 2028
  • [27] Surface Defect Detection Using Deep U-Net Network Architectures
    Uzen, Huseyin
    Turkoglu, Muammer
    Hanbay, Davut
    29TH IEEE CONFERENCE ON SIGNAL PROCESSING AND COMMUNICATIONS APPLICATIONS (SIU 2021), 2021,
  • [28] Lunar ground segmentation using a modified U-net neural network
    Petrakis, Georgios
    Partsinevelos, Panagiotis
    MACHINE VISION AND APPLICATIONS, 2024, 35 (03)
  • [29] Lightweight Deep Learning Model, ConvNeXt-U: An Improved U-Net Network for Extracting Cropland in Complex Landscapes from Gaofen-2 Images
    Liu, Shukuan
    Cao, Shi
    Lu, Xia
    Peng, Jiqing
    Ping, Lina
    Fan, Xiang
    Teng, Feiyu
    Liu, Xiangnan
    SENSORS, 2025, 25 (01)
  • [30] Seismic data fault detection based on U-Net deep learning network
    Yang W.
    Yang J.
    Chen S.
    Kuang L.
    Wang E.
    Zhou C.
    Shiyou Diqiu Wuli Kantan/Oil Geophysical Prospecting, 2021, 56 (04): : 688 - 697