A Method of Aerial Multi-Modal Image Registration for a Low-Visibility Approach Based on Virtual Reality Fusion

Cited by: 3
Authors
Wu, Yuezhou [1 ]
Liu, Changjiang [2 ]
Affiliations
[1] Civil Aviat Flight Univ China, Sch Comp Sci, Guanghan 618307, Peoples R China
[2] Sichuan Univ Sci & Engn, Key Lab Higher Educ Sichuan Prov Enterprise Inform, Zigong 643000, Peoples R China
Source
APPLIED SCIENCES-BASEL | 2023, Vol. 13, Issue 06
Funding
National Key Research and Development Program of China;
Keywords
infrared image; multi-modal images; image registration; image fusion; EVS (enhanced vision system);
DOI
10.3390/app13063396
CLC Number
O6 [Chemistry];
Subject Classification Code
0703;
Abstract
Aiming at aircraft approach and landing under low visibility, this paper studies the use of an infrared thermal imaging camera and a visible-light camera to obtain dynamic hyperspectral images of flight approach scenes, from the perspective of enhancing pilot vision. To address affine deformation, the difficulty of extracting similar geometric features, thermal shadows, light shadows, and other problems in the registration of heterogeneous infrared and visible-light images, a multi-modal image registration method is proposed based on RoI driving in a virtual scene, RoI feature extraction, and virtual-reality-fusion-based contour angle orientation; this method reduces the area to be registered, reduces the amount of computation, and improves real-time registration accuracy. To address the differences between multi-modal images in resolution, contrast, color channels, color information strength, and other aspects, the contour angle orientation handles the geometric deformation of multi-source images well, and the virtual reality fusion technique effectively removes incorrectly matched point pairs. By integrating redundant and complementary information from the multi-modal images, the method enhances the pilot's visual perception during the approach as a whole.
Pages: 15
Related Papers
50 records in total
  • [31] Morphological Pyramid Multi-modal Medical Image Registration Based on QPSO
    Huang, Xiaosheng
    Zhang, Fang
    2009 INTERNATIONAL ASIA SYMPOSIUM ON INTELLIGENT INTERACTION AND AFFECTIVE COMPUTING, 2009, : 67 - 70
  • [32] Multi-Modal Medical Image Fusion Using Transfer Learning Approach
    Kalamkar, Shrida
    Mary, Geetha A.
    INTERNATIONAL JOURNAL OF ADVANCED COMPUTER SCIENCE AND APPLICATIONS, 2022, 13 (12) : 483 - 488
  • [33] A Discrete Search Method for Multi-modal Non-Rigid Image Registration
    Shekhovtsov, Alexander
    Garcia-Arteaga, Juan D.
    Werner, Tomas
    2008 IEEE COMPUTER SOCIETY CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION WORKSHOPS, VOLS 1-3, 2008, : 915 - 920
  • [34] Multi-modal Perception Fusion Method Based on Cross Attention
    Zhang B.-L.
    Pan Z.-H.
    Jiang J.-Z.
    Zhang C.-B.
    Wang Y.-X.
    Yang C.-L.
    Zhongguo Gonglu Xuebao/China Journal of Highway and Transport, 2024, 37 (03) : 181 - 193
  • [35] Visual Sorting Method Based on Multi-Modal Information Fusion
    Han, Song
    Liu, Xiaoping
    Wang, Gang
    APPLIED SCIENCES-BASEL, 2022, 12 (06)
  • [36] Evaluation Method of Teaching Styles Based on Multi-modal Fusion
    Tang, Wen
    Wang, Chongwen
    Zhang, Yi
    2021 THE 7TH INTERNATIONAL CONFERENCE ON COMMUNICATION AND INFORMATION PROCESSING, ICCIP 2021, 2021, : 9 - 15
  • [37] MFHOD: Multi-modal image fusion method based on the higher-order degradation model
    Guo, Jinxin
    Zhan, Weida
    Jiang, Yichun
    Ge, Wei
    Chen, Yu
    Xu, Xiaoyu
    Li, Jin
    Liu, Yanyan
    EXPERT SYSTEMS WITH APPLICATIONS, 2024, 249
  • [38] Multi-modal image fusion based on saliency guided in NSCT domain
    Wang, Shiying
    Shen, Yan
    IET IMAGE PROCESSING, 2020, 14 (13) : 3188 - 3201
  • [39] Leveraging multi-modal fusion for graph-based image annotation
    Amiri, S. Hamid
    Jamzad, Mansour
    JOURNAL OF VISUAL COMMUNICATION AND IMAGE REPRESENTATION, 2018, 55 : 816 - 828
  • [40] EVolution: an edge-based variational method for non-rigid multi-modal image registration
    de Senneville, B. Denis
    Zachiu, C.
    Ries, M.
    Moonen, C.
    PHYSICS IN MEDICINE AND BIOLOGY, 2016, 61 (20) : 7377 - 7396