Spectral unmixing using minimum volume constrained Kullback-Leibler divergence

Cited: 0
Authors
Mohammed, Salah A. G. [1 ]
Meddeber, Lila [2 ]
Zouagui, Tarik [1 ]
Karoui, Moussa S. [3 ]
Institutions
[1] Univ Sci & Technol Oran, Embedded Syst & Microsyst Lab, Oran, Algeria
[2] Univ Sci & Technol Oran, Res Lab Intelligent Syst, Oran, Algeria
[3] Ctr Tech Spatiales, Arzew, Algeria
Keywords
spectral unmixing; hyperspectral imaging; linear mixing model; Kullback-Leibler; nonnegative matrix factorization; fast algorithm
DOI
10.1117/1.JRS.14.024511
Chinese Library Classification
X [Environmental Science, Safety Science]
Subject Classification Codes
08; 0830
Abstract
Spectral unmixing (SU) has received particular attention in the hyperspectral imaging literature. Most SU algorithms are based on the linear mixing model (LMM), which assumes that each pixel of the image is a linear combination of a given number of pure spectra, called endmembers, weighted by coefficients called abundances. SU is a technique to identify these endmembers and their relative abundances. We present an LMM approach based on nonnegative matrix factorization that combines the minimum volume constraint (MVC) with the Kullback-Leibler (KL) divergence, referred to as KL-MVC. The proposed method is evaluated on synthetic images with different noise levels and on real images with different initialization methods, and it achieves high performance compared with widely used LMM-based methods. (C) 2020 Society of Photo-Optical Instrumentation Engineers (SPIE)
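The LMM and the KL-divergence NMF objective described in the abstract can be sketched as follows. This is a minimal illustration using the standard Lee-Seung multiplicative updates for the generalized KL divergence, not the paper's KL-MVC algorithm: the minimum-volume penalty on the endmember matrix is omitted, and all variable names and problem sizes are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Linear mixing model (LMM): each pixel x = E @ a, where the columns of E
# are endmember spectra and a holds the (sum-to-one) abundances.
bands, pixels, k = 50, 200, 3
E_true = rng.random((bands, k))                     # synthetic endmembers
A_true = rng.dirichlet(np.ones(k), size=pixels).T   # abundances; columns sum to 1
V = E_true @ A_true                                 # mixed pixels (bands x pixels)

# Multiplicative updates minimizing the generalized KL divergence D(V || W H)
# (Lee & Seung); the minimum-volume constraint used in KL-MVC is not included.
W = rng.random((bands, k)) + 1e-3                   # estimated endmembers
H = rng.random((k, pixels)) + 1e-3                  # estimated abundances
eps = 1e-12
for _ in range(500):
    WH = W @ H + eps
    H *= (W.T @ (V / WH)) / (W.sum(axis=0)[:, None] + eps)
    WH = W @ H + eps
    W *= ((V / WH) @ H.T) / (H.sum(axis=1)[None, :] + eps)

A_est = H / H.sum(axis=0, keepdims=True)  # impose abundance sum-to-one afterwards
kl = np.sum(V * np.log((V + eps) / (W @ H + eps)) - V + W @ H)
print(f"final generalized KL divergence: {kl:.6f}")
```

Because the synthetic data are an exact rank-3 nonnegative product, the KL divergence drops close to zero; on real hyperspectral data, the volume constraint is what disambiguates the many factorizations that fit the data equally well.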
Pages: 16