Spectral unmixing using minimum volume constrained Kullback-Leibler divergence

Cited: 0
Authors
Mohammed, Salah A. G. [1 ]
Meddeber, Lila [2 ]
Zouagui, Tarik [1 ]
Karoui, Moussa S. [3 ]
Affiliations
[1] Univ Sci & Technol Oran, Embedded Syst & Microsyst Lab, Oran, Algeria
[2] Univ Sci & Technol Oran, Res Lab Intelligent Syst, Oran, Algeria
[3] Ctr Tech Spatiales, Arzew, Algeria
Keywords
spectral unmixing; hyperspectral imaging; linear mixing model; Kullback-Leibler; nonnegative matrix factorization; fast algorithm
DOI
10.1117/1.JRS.14.024511
Chinese Library Classification (CLC)
X [Environmental Science, Safety Science]
Subject classification code
08; 0830
Abstract
Spectral unmixing (SU) has received particular attention in the hyperspectral imaging literature. Most SU algorithms are based on the linear mixing model (LMM), which assumes that each pixel of the image is a linear combination of a given number of pure spectra, called endmembers, weighted by coefficients called abundances. SU is a technique for identifying these endmembers and their relative abundances. We present an LMM approach based on nonnegative matrix factorization that combines the minimum volume constraint (MVC) with the Kullback-Leibler (KL) divergence, referred to as KL-MVC. The proposed method is evaluated on synthetic images with different noise levels and on real images with different initialization methods, and it achieves high performance compared with widely used LMM-based methods. (C) 2020 Society of Photo-Optical Instrumentation Engineers (SPIE)
Pages: 16
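To illustrate the kind of factorization the abstract describes, the sketch below shows a generic KL-divergence NMF unmixing loop with a crude minimum-volume-style penalty. This is not the authors' KL-MVC algorithm: the multiplicative KL updates are the standard Lee-Seung form, the volume proxy (shrinking endmembers toward their mean) and all names and parameters (kl_mvc_unmix_sketch, lam, n_iter) are illustrative assumptions, and the sum-to-one handling is a simple renormalization.

```python
# Minimal sketch of KL-divergence NMF for linear spectral unmixing with a
# simple minimum-volume-style penalty. NOT the paper's KL-MVC method; the
# volume proxy and all parameter choices here are illustrative assumptions.
import numpy as np

def kl_mvc_unmix_sketch(X, n_endmembers, n_iter=500, lam=1e-3, eps=1e-12, seed=0):
    """X: (bands, pixels) nonnegative hyperspectral data matrix.
    Returns E (bands, endmembers) and A (endmembers, pixels),
    with each column of A renormalized to sum to one."""
    rng = np.random.default_rng(seed)
    bands, pixels = X.shape
    E = rng.random((bands, n_endmembers)) + eps       # endmember spectra
    A = rng.random((n_endmembers, pixels)) + eps      # abundances
    A /= A.sum(axis=0, keepdims=True)                 # sum-to-one init

    for _ in range(n_iter):
        R = X / (E @ A + eps)                         # elementwise ratio X / (EA)

        # Multiplicative KL update for abundances (Lee-Seung style)
        A *= (E.T @ R) / (E.T @ np.ones_like(X) + eps)
        A /= A.sum(axis=0, keepdims=True)             # re-impose sum-to-one

        # Multiplicative KL update for endmembers
        R = X / (E @ A + eps)
        E *= (R @ A.T) / (np.ones_like(X) @ A.T + eps)

        # Crude volume-style shrinkage: gradient step on
        # 0.5 * lam * ||E - mean(E)||_F^2, pulling endmembers together.
        E -= lam * (E - E.mean(axis=1, keepdims=True))
        E = np.maximum(E, eps)                        # keep nonnegativity

    return E, A

if __name__ == "__main__":
    # Tiny synthetic example: 3 endmembers mixed into 100 pixels over 50 bands.
    rng = np.random.default_rng(1)
    E_true = rng.random((50, 3))
    A_true = rng.dirichlet(np.ones(3), size=100).T
    X = E_true @ A_true
    E_hat, A_hat = kl_mvc_unmix_sketch(X, n_endmembers=3)
    print("abundance columns sum to one:", np.allclose(A_hat.sum(axis=0), 1.0))
```

The separation of the two update steps mirrors the usual alternating NMF scheme: abundances are updated with the endmembers fixed and then renormalized, after which the endmembers are updated and regularized toward a smaller-volume configuration.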