Average Kullback-Leibler Divergence for Random Finite Sets

Cited: 0
Authors
Battistelli, Giorgio [1 ]
Chisci, Luigi [1 ]
Fantacci, Claudio [1 ]
Farina, Alfonso [2 ]
Vo, Ba-Ngu [3 ]
Affiliations
[1] Univ Florence, Dipartimento Ingn Informaz, I-50139 Florence, Italy
[2] Selex ES, I-00131 Rome, Italy
[3] Curtin Univ, Dept Elect & Comp Engn, Bentley, WA 6102, Australia
Keywords
multiobject estimation; sensor networks; distributed fusion; random finite sets; consensus; multitarget tracking
DOI
not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
The paper deals with the fusion of multiobject information over a network of heterogeneous and geographically dispersed nodes with sensing, communication, and processing capabilities. To exploit the benefits of sensor networks for multiobject estimation problems, such as multitarget tracking and multirobot SLAM (Simultaneous Localization and Mapping), a key issue to be addressed is how to consistently fuse (average) locally updated multiobject densities. In this paper, we discuss the generalization of the Kullback-Leibler average, originally conceived for single-object densities (i.e., probability density functions), to both unlabeled and labeled multiobject densities. Then, with a view to developing scalable and reliable distributed multiobject estimation algorithms, we review approaches to iteratively compute, in each node of the network, the collective multiobject average via scalable, neighborwise computations.
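For single-object densities, the Kullback-Leibler average has a known closed form: the normalized weighted geometric mean of the local densities, and a consensus scheme can reach it through purely neighborwise exchanges. The sketch below illustrates this for discrete densities; the function names, the fixed iteration count, and the mixing matrix are illustrative assumptions, not the paper's algorithm (which concerns multiobject, i.e., random-finite-set, densities).

```python
import numpy as np

def kl_average(densities, weights):
    """Weighted Kullback-Leibler average of discrete densities:
    the normalized weighted geometric mean (closed form for pdfs)."""
    log_avg = sum(w * np.log(p) for w, p in zip(weights, densities))
    avg = np.exp(log_avg)
    return avg / avg.sum()

def consensus_fusion(densities, mixing_matrix, iterations=50):
    """Neighborwise consensus: each node repeatedly replaces its density
    with the weighted geometric mean of its neighbors' densities.
    mixing_matrix row i holds node i's weights for its neighbors only,
    so each step uses local communication."""
    q = np.array([p / p.sum() for p in densities])
    for _ in range(iterations):
        logs = mixing_matrix @ np.log(q)      # mix in log-space
        q = np.exp(logs)
        q /= q.sum(axis=1, keepdims=True)     # renormalize each node
    return q
```

With a doubly stochastic mixing matrix on a connected network, every node's density converges to the unweighted KL average of all the local densities, which is what makes the computation scalable: no node ever needs the whole network's data at once.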
Pages: 1359-1366 (8 pages)