Spectral unmixing using minimum volume constrained Kullback-Leibler divergence

Cited by: 0
Authors
Mohammed, Salah A. G. [1 ]
Meddeber, Lila [2 ]
Zouagui, Tarik [1 ]
Karoui, Moussa S. [3 ]
Affiliations
[1] Univ Sci & Technol Oran, Embedded Syst & Microsyst Lab, Oran, Algeria
[2] Univ Sci & Technol Oran, Res Lab Intelligent Syst, Oran, Algeria
[3] Ctr Tech Spatiales, Arzew, Algeria
Keywords
spectral unmixing; hyperspectral imaging; linear mixing model; Kullback-Leibler; nonnegative matrix factorization; fast algorithm
DOI
10.1117/1.JRS.14.024511
Chinese Library Classification
X [Environmental Science, Safety Science];
Subject Classification Code
08 ; 0830 ;
Abstract
Spectral unmixing (SU) has received particular attention in the hyperspectral imaging literature. Most SU algorithms are based on the linear mixing model (LMM), which assumes that each pixel of the image is a linear combination of a given number of pure spectra, called endmembers, weighted by coefficients called abundances. SU is a technique for identifying these endmembers and their relative abundances. We present an LMM approach based on nonnegative matrix factorization that combines the minimum volume constraint (MVC) with the Kullback-Leibler (KL) divergence, referred to as KL-MVC. The proposed method is evaluated on synthetic images with different noise levels and on real images with different initialization methods, achieving high performance compared with widely used LMM-based methods. (C) 2020 Society of Photo-Optical Instrumentation Engineers (SPIE)
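As a rough illustration of the setup described in the abstract: under the LMM, the hyperspectral data matrix X (bands x pixels) factors as X ≈ EA, with nonnegative endmember spectra E and abundances A, and a KL-divergence NMF estimates both. The sketch below uses the classic multiplicative updates for the KL objective; the function name and parameters are illustrative, and the paper's minimum volume constraint and abundance sum-to-one handling are omitted.

```python
import numpy as np

def kl_nmf_unmix(X, p, n_iter=200, eps=1e-9, seed=0):
    """Sketch of KL-divergence NMF for linear spectral unmixing.

    X : (bands, pixels) nonnegative hyperspectral data matrix.
    p : number of endmembers.
    Returns E (bands, p) endmember spectra and A (p, pixels) abundances,
    using the standard multiplicative updates for the KL objective
    D_KL(X || EA); the minimum volume constraint of KL-MVC is omitted.
    """
    rng = np.random.default_rng(seed)
    bands, pixels = X.shape
    E = rng.random((bands, p)) + eps
    A = rng.random((p, pixels)) + eps
    for _ in range(n_iter):
        # Abundance update: A <- A * (E^T (X / EA)) / (E^T 1)
        R = X / (E @ A + eps)
        A *= (E.T @ R) / (E.sum(axis=0)[:, None] + eps)
        # Endmember update: E <- E * ((X / EA) A^T) / (1 A^T)
        R = X / (E @ A + eps)
        E *= (R @ A.T) / (A.sum(axis=1)[None, :] + eps)
    return E, A
```

A full unmixing pipeline would additionally enforce the abundance sum-to-one constraint (e.g. by renormalizing each pixel's abundances and rescaling accordingly) and add the minimum volume penalty on E to the objective.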
Pages: 16
Related papers
50 records in total
  • [41] Minimization of the Kullback-Leibler Divergence for Nonlinear Estimation
    Darling, Jacob E.
    DeMars, Kyle J.
    JOURNAL OF GUIDANCE CONTROL AND DYNAMICS, 2017, 40 (07) : 1739 - 1748
  • [42] Acoustic environment identification by Kullback-Leibler divergence
    Delgado-Gutierrez, G.
    Rodriguez-Santos, F.
    Jimenez-Ramirez, O.
    Vazquez-Medina, R.
    FORENSIC SCIENCE INTERNATIONAL, 2017, 281 : 134 - 140
  • [43] Kullback-Leibler Divergence for Nonnegative Matrix Factorization
    Yang, Zhirong
    Zhang, He
    Yuan, Zhijian
    Oja, Erkki
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2011, PT I, 2011, 6791 : 250 - 257
  • [44] Modulation Classification Based on Kullback-Leibler Divergence
    Im, Chaewon
    Ahn, Seongjin
    Yoon, Dongweon
    15TH INTERNATIONAL CONFERENCE ON ADVANCED TRENDS IN RADIOELECTRONICS, TELECOMMUNICATIONS AND COMPUTER ENGINEERING (TCSET - 2020), 2020, : 373 - 376
  • [45] MINIMIZATION OF THE KULLBACK-LEIBLER DIVERGENCE FOR NONLINEAR ESTIMATION
    Darling, Jacob E.
    DeMars, Kyle J.
    ASTRODYNAMICS 2015, 2016, 156 : 213 - 232
  • [46] Local inconsistency detection using the Kullback-Leibler divergence measure
    Spineli, Loukia M.
    SYSTEMATIC REVIEWS, 2024, 13 (01)
  • [47] Minimax Regret on Patterns Using Kullback-Leibler Divergence Covering
    Tang, Jennifer
    CONFERENCE ON LEARNING THEORY, VOL 178, 2022, 178
  • [48] AN EFFECTIVE IMAGE RESTORATION USING KULLBACK-LEIBLER DIVERGENCE MINIMIZATION
    Hanif, Muhammad
    Seghouane, Abd-Krim
    2014 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING (ICIP), 2014, : 4522 - 4526
  • [49] Vector-quantization by density matching in the minimum Kullback-Leibler divergence sense
    Hegde, A
    Erdogmus, D
    Lehn-Schioler, T
    Rao, YN
    Principe, JC
    2004 IEEE INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, VOLS 1-4, PROCEEDINGS, 2004, : 105 - 109
  • [50] Distributed Fusion of Multiple Model Estimators Using Minimum Forward Kullback-Leibler Divergence Sum
    Wei, Zheng
    Duan, Zhansheng
    Hanebeck, Uwe D.
    IEEE TRANSACTIONS ON AEROSPACE AND ELECTRONIC SYSTEMS, 2024, 60 (03) : 2934 - 2947