Discovery of Semantic Relationships in PolSAR Images Using Latent Dirichlet Allocation

Cited by: 11
Authors
Tanase, Radu [1 ,2 ]
Bahmanyar, Reza [3 ]
Schwarz, Gottfried [3 ]
Datcu, Mihai [1 ,3 ]
Affiliations
[1] Univ Politehn Bucuresti, CEOSpaceTech, Res Ctr Spatial Informat, Bucharest 060042, Romania
[2] Mil Tech Acad, Bucharest 050141, Romania
[3] German Aerosp Ctr, Remote Sensing Technol Inst, D-82234 Wessling, Germany
Keywords
Bag-of-topics (BoT); bag-of-words (BoW); Entropy/Anisotropy/Alpha-Wishart classification; latent Dirichlet allocation (LDA); polarimetric synthetic aperture radar (PolSAR); semantic relationships; decomposition
DOI
10.1109/LGRS.2016.2636663
Chinese Library Classification
P3 [Geophysics]; P59 [Geochemistry]
Subject classification codes
0708; 070902
Abstract
We propose a multilevel semantics discovery approach for bridging the semantic gap when mining high-resolution polarimetric synthetic aperture radar (PolSAR) remote sensing images. First, an Entropy/Anisotropy/Alpha-Wishart classifier is employed to discover low-level semantics as classes representing the physical scattering properties of targets (e.g., low entropy/surface scattering/high anisotropy). Then, the images are tiled into patches, and each patch is modeled as a bag-of-words (BoW), i.e., a histogram of its class labels. Next, latent Dirichlet allocation (LDA) is applied to discover higher level semantics as a set of topics. Our results demonstrate that the topic semantics are close to the human semantics used for basic land-cover types (e.g., grassland). Therefore, using the topic description (bag-of-topics) of PolSAR images narrows the semantic gap in image mining. In addition, a visual exploration of the topic descriptions helps to find semantic relationships, which can be used for defining new semantic categories (e.g., mixed land-cover types) and designing rule-based categorization schemes.
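The processing chain described in the abstract (per-pixel scattering classes, patch-level bag-of-words histograms, LDA topics, bag-of-topics descriptors) can be sketched in a few lines. The snippet below is a minimal illustration only: it assumes the Entropy/Anisotropy/Alpha-Wishart class-label map is already available as a 2-D integer array, and the patch size, number of classes, topic count, and the use of scikit-learn's LatentDirichletAllocation are illustrative choices rather than settings taken from the paper.

```python
# Minimal sketch of the patch-level BoW -> LDA pipeline described in the abstract.
# Assumptions (not from the paper): `label_map` stands in for a precomputed
# H/A/Alpha-Wishart classification map; patch size, class count, and topic count
# are placeholder values.
import numpy as np
from sklearn.decomposition import LatentDirichletAllocation

def patch_bow(label_map, n_classes, patch_size):
    """Tile the class-label map into square patches and count class labels per patch."""
    h, w = label_map.shape
    rows = []
    for r in range(0, h - patch_size + 1, patch_size):
        for c in range(0, w - patch_size + 1, patch_size):
            patch = label_map[r:r + patch_size, c:c + patch_size]
            rows.append(np.bincount(patch.ravel(), minlength=n_classes))
    return np.asarray(rows)  # shape: (n_patches, n_classes)

# Hypothetical input: 16 scattering classes, 64x64-pixel patches on a 512x512 label map.
rng = np.random.default_rng(0)
label_map = rng.integers(0, 16, size=(512, 512))   # stand-in for a real classification map
bow = patch_bow(label_map, n_classes=16, patch_size=64)

# Discover higher-level semantics as topics; each patch becomes a bag-of-topics vector.
lda = LatentDirichletAllocation(n_components=5, random_state=0)
bot = lda.fit_transform(bow)                       # shape: (n_patches, n_topics)
print(bot[0])                                      # topic mixture of the first patch
```

In this sketch the rows of `bot` are the bag-of-topics descriptors that the abstract refers to; in practice they would be computed from a real classification map and inspected against known land-cover types.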
Pages: 237-241
Page count: 5
Related papers
50 records in total
  • [21] Sparsely labeled coral images segmentation with Latent Dirichlet Allocation
    Yu, Xi
    Bing, Ouyang
    Principe, Jose C.
    Farrington, Stephanie
    Reed, John
    GLOBAL OCEANS 2020: SINGAPORE - U.S. GULF COAST, 2020,
  • [22] Latent Dirichlet allocation
    Blei, DM
    Ng, AY
    Jordan, MI
    JOURNAL OF MACHINE LEARNING RESEARCH, 2003, 3 (4-5) : 993 - 1022
  • [23] Bug localization using latent Dirichlet allocation
    Lukins, Stacy K.
    Kraft, Nicholas A.
    Etzkorn, Letha H.
    INFORMATION AND SOFTWARE TECHNOLOGY, 2010, 52 (09) : 972 - 990
  • [24] Latent Dirichlet allocation
    Blei, DM
    Ng, AY
    Jordan, MI
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 14, VOLS 1 AND 2, 2002, 14 : 601 - 608
  • [25] Author Identification Using Latent Dirichlet Allocation
    Calvo, Hiram
    Hernandez-Castaneda, Angel
    Garcia-Flores, Jorge
    COMPUTATIONAL LINGUISTICS AND INTELLIGENT TEXT PROCESSING, CICLING 2017, PT II, 2018, 10762 : 303 - 312
  • [26] Semantic similarity measure for topic modeling using latent Dirichlet allocation and collapsed Gibbs sampling
    Micheal Olalekan Ajinaja
    Adebayo Olusola Adetunmbi
    Chukwuemeka Christian Ugwu
    Olugbemiga Solomon Popoola
    Iran Journal of Computer Science, 2023, 6 (1) : 81 - 94
  • [27] Phishing detection and impersonated entity discovery using Conditional Random Field and Latent Dirichlet Allocation
    Ramanathan, Venkatesh
    Wechsler, Harry
    COMPUTERS & SECURITY, 2013, 34 : 123 - 139
  • [28] A Temporal Extension of Latent Dirichlet Allocation for Unsupervised Acoustic Unit Discovery
    van der Merwe, Werner
    Kamper, Herman
    du Preez, Johan
    INTERSPEECH 2022, 2022, : 1426 - 1430
  • [29] Feature-Free Explainable Data Mining in SAR Images Using Latent Dirichlet Allocation
    Karmakar, Chandrabali
    Dumitru, Corneliu Octavian
    Schwarz, Gottfried
    Datcu, Mihai
    IEEE JOURNAL OF SELECTED TOPICS IN APPLIED EARTH OBSERVATIONS AND REMOTE SENSING, 2021, 14 : 676 - 689
  • [30] A PERCEPTUAL HASHING ALGORITHM USING LATENT DIRICHLET ALLOCATION
    Vretos, Nicholas
    Nikolaidis, Nikos
    Pitas, Ioannis
    ICME: 2009 IEEE INTERNATIONAL CONFERENCE ON MULTIMEDIA AND EXPO, VOLS 1-3, 2009, : 362 - 365