Average Kullback-Leibler Divergence for Random Finite Sets

Cited by: 0
Authors
Battistelli, Giorgio [1]
Chisci, Luigi [1]
Fantacci, Claudio [1]
Farina, Alfonso [2]
Vo, Ba-Ngu [3]
Affiliations
[1] Univ Florence, Dipartimento Ingn Informaz, I-50139 Florence, Italy
[2] Selex ES, I-00131 Rome, Italy
[3] Curtin Univ, Dept Elect & Comp Engn, Bentley, WA 6102, Australia
Keywords
multiobject estimation; sensor networks; distributed fusion; random finite sets; consensus; multitarget tracking
DOI: not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
The paper deals with the fusion of multiobject information over a network of heterogeneous and geographically dispersed nodes with sensing, communication and processing capabilities. To exploit the benefits of sensor networks in multiobject estimation problems, such as multitarget tracking and multirobot SLAM (Simultaneous Localization and Mapping), a key issue is how to consistently fuse (average) locally updated multiobject densities. In this paper we discuss the generalization of the Kullback-Leibler average, originally conceived for single-object densities (i.e., probability density functions), to both unlabeled and labeled multiobject densities. Then, with a view to developing scalable and reliable distributed multiobject estimation algorithms, we review approaches that iteratively compute, in each node of the network, the collective multiobject average via scalable, neighborwise computations.
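For context, the following is a brief sketch of the construction the abstract refers to; the notation (multiobject densities \pi_i, weights \omega_i, neighbor set \mathcal{N}_i) is introduced here only for illustration and is not taken from this record. With weights \omega_i \ge 0, \sum_{i=1}^{N} \omega_i = 1, the weighted Kullback-Leibler average of the multiobject densities \pi_1, \dots, \pi_N is commonly defined as

\[ \bar{\pi} = \arg\min_{\pi} \sum_{i=1}^{N} \omega_i \, D_{KL}(\pi \,\|\, \pi_i), \qquad D_{KL}(\pi \,\|\, \pi_i) = \int \pi(X) \log \frac{\pi(X)}{\pi_i(X)} \, \delta X, \]

where the integral is a set integral over finite sets X. Its minimizer is the normalized weighted geometric mean

\[ \bar{\pi}(X) = \frac{\prod_{i=1}^{N} [\pi_i(X)]^{\omega_i}}{\int \prod_{i=1}^{N} [\pi_i(X)]^{\omega_i} \, \delta X}, \]

and a consensus recursion of the form

\[ \pi_i^{(k+1)}(X) \propto \prod_{j \in \mathcal{N}_i \cup \{i\}} \big[ \pi_j^{(k)}(X) \big]^{\omega_{ij}} \]

lets each node approach the collective average through repeated neighborwise fusion steps.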
Pages: 1359-1366 (8 pages)