Multi-focus image fusion method using S-PCNN optimized by particle swarm optimization

Cited by: 33
Authors
Jin, Xin [1 ]
Zhou, Dongming [1 ]
Yao, Shaowen [2 ]
Nie, Rencan [1 ]
Jiang, Qian [1 ]
He, Kangjian [1 ]
Wang, Quan [1 ]
Affiliations
[1] Yunnan Univ, Sch Informat, Cuihu Rd, Kunming 650091, Yunnan, Peoples R China
[2] Yunnan Univ, Sch Software, Cuihu Rd, Kunming 650091, Yunnan, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Image processing; Image fusion; Simplified pulse-coupled neural networks; Particle swarm optimization; Feature extraction; COUPLED NEURAL-NETWORK; NONSUBSAMPLED CONTOURLET TRANSFORM; SELECTION; COLOR; MODEL;
DOI
10.1007/s00500-017-2694-4
Chinese Library Classification
TP18 [Artificial intelligence theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
This paper proposes a novel image fusion method based on a simplified pulse-coupled neural network (S-PCNN), particle swarm optimization (PSO), and block-based image processing. In general, the parameters of the S-PCNN are set manually, which is complex and time-consuming and often leads to inconsistency. In this paper, the parameters of the S-PCNN are instead set by the PSO algorithm to overcome these shortcomings and improve fusion performance. First, the source images are divided into equally sized sub-blocks, and the spatial frequency of each sub-block is computed as its characteristic factor, yielding a characterization factor matrix (CFM) for the whole source image; in this way the computational cost is effectively reduced. Second, the S-PCNN analyzes the CFM to obtain its oscillation frequency graph (OFG). Third, the fused CFM is obtained according to the OFG. Finally, the fused image is reconstructed from the fused CFM and the block rule. Throughout this process, the parameters of the S-PCNN are set by the PSO algorithm to achieve the best fusion effect. Owing to the CFM and the block scheme, the computational cost of the proposed method is effectively reduced. Experiments indicate that the proposed multi-focus image fusion algorithm is more efficient than traditional image fusion algorithms, and they confirm that the automatic parameter-setting method is effective.
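The CFM stage described in the abstract can be sketched as follows. This is a minimal illustration using the standard definition of spatial frequency (SF = sqrt(RF^2 + CF^2), where RF and CF are the row and column frequencies); the paper's S-PCNN/PSO decision stage is replaced here by a simple max-SF block-selection rule purely for illustration, and the function names are the author's own, not from the paper:

```python
import numpy as np

def spatial_frequency(block):
    """Spatial frequency of a 2-D block: SF = sqrt(RF^2 + CF^2)."""
    rf = np.sqrt(np.mean(np.diff(block, axis=1) ** 2))  # row frequency
    cf = np.sqrt(np.mean(np.diff(block, axis=0) ** 2))  # column frequency
    return np.sqrt(rf ** 2 + cf ** 2)

def characterization_factor_matrix(img, bs):
    """Divide img into bs x bs sub-blocks; return the SF of each block (the CFM)."""
    rows, cols = img.shape[0] // bs, img.shape[1] // bs
    cfm = np.empty((rows, cols))
    for i in range(rows):
        for j in range(cols):
            cfm[i, j] = spatial_frequency(img[i * bs:(i + 1) * bs,
                                              j * bs:(j + 1) * bs])
    return cfm

def fuse_by_max_sf(img_a, img_b, bs=8):
    """Baseline block-selection fusion: take each block from the sharper image.
    (The paper uses an S-PCNN whose oscillation frequency graph, with
    PSO-tuned parameters, drives this choice instead.)"""
    cfm_a = characterization_factor_matrix(img_a, bs)
    cfm_b = characterization_factor_matrix(img_b, bs)
    fused = img_a.copy()
    for i, j in zip(*np.nonzero(cfm_b > cfm_a)):
        fused[i * bs:(i + 1) * bs, j * bs:(j + 1) * bs] = \
            img_b[i * bs:(i + 1) * bs, j * bs:(j + 1) * bs]
    return fused
```

Because each sub-block is summarized by a single CFM entry, the S-PCNN operates on a matrix that is smaller than the source image by a factor of bs^2, which is the cost reduction the abstract refers to.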
Pages: 6395-6407
Number of pages: 13
Related papers
50 records in total
  • [31] Multi-Focus Image Fusion Using Fuzzy Logic
    Chamankar, Amaj
    Sheikhan, Mansour
    Razaghian, Farhad
    2013 13TH IRANIAN CONFERENCE ON FUZZY SYSTEMS (IFSC), 2013,
  • [32] Multi-focus image fusion using fractal dimension
    Panigrahy, Chinmaya
    Seal, Ayan
    Mahato, Nihar Kumar
    Krejcar, Ondrej
    Herrera-Viedma, Enrique
    APPLIED OPTICS, 2020, 59 (19) : 5642 - 5655
  • [33] Image registration for multi-focus image fusion
    Zhang, Z
    Blum, RS
    BATTLESPACE DIGITIZATION AND NETWORK-CENTRIC WARFARE, 2001, 4396 : 279 - 290
  • [34] Improved Multi-Focus Image Fusion
    Jameel, Amina
    Noor, Fouzia
    2015 18TH INTERNATIONAL CONFERENCE ON INFORMATION FUSION (FUSION), 2015, : 1346 - 1352
  • [35] A Multi-focus Image Fusion Classifier
    Siddiqui, Abdul Basit
    Rashid, Muhammad
    Jaffar, M. Arfan
    Hussain, Ayyaz
    Mirza, Anwar M.
    INFORMATION-AN INTERNATIONAL INTERDISCIPLINARY JOURNAL, 2012, 15 (04): : 1757 - 1764
  • [36] Multi-focus thermal image fusion
    Benes, Radek
    Dvorak, Pavel
    Faundez-Zanuy, Marcos
    Espinosa-Duro, Virginia
    Mekyska, Jiri
    PATTERN RECOGNITION LETTERS, 2013, 34 (05) : 536 - 544
  • [37] Multi-focus Image Fusion Algorithm Based On Prewitt Edge Detect Information Motivated PCNN
    Du, Chao-ben
    Gao, She-sheng
    PROCEEDINGS OF THE 2017 INTERNATIONAL CONFERENCE ON MANUFACTURING ENGINEERING AND INTELLIGENT MATERIALS (ICMEIM 2017), 2017, 100 : 223 - 228
  • [38] Fractal dimension based parameter adaptive dual channel PCNN for multi-focus image fusion
    Panigrahy, Chinmaya
    Seal, Ayan
    Mahato, Nihar Kumar
    OPTICS AND LASERS IN ENGINEERING, 2020, 133
  • [39] Multi-focus Image Fusion Method Based on NSST and IICM
    Lei, Yang
    ADVANCES IN INTERNETWORKING, DATA & WEB TECHNOLOGIES, EIDWT-2017, 2018, 6 : 679 - 689
  • [40] A multi-focus image fusion method based on wavelet transform
    Yang, Shen
    Deng, Ai
    Journal of Computational Information Systems, 2010, 6 (03): : 839 - 846