FSMI: Fast computation of Shannon Mutual Information for information-theoretic mapping

Cited by: 2
Authors
Zhang, Zhengdong [1 ]
Henderson, Trevor [1 ]
Sze, Vivienne [1 ]
Karaman, Sertac [1 ]
Affiliations
[1] MIT, Cambridge, MA 02139 USA
Keywords
EXPLORATION;
DOI
10.1109/icra.2019.8793541
CLC Number
TP [Automation Technology, Computer Technology];
Discipline Code
0812 ;
Abstract
Information-based mapping algorithms are critical to robot exploration tasks in several applications ranging from disaster response to space exploration. Unfortunately, most existing information-based mapping algorithms are plagued by the computational difficulty of evaluating the Shannon mutual information between potential future sensor measurements and the map. This has led researchers to develop approximate methods, such as Cauchy-Schwarz Quadratic Mutual Information (CSQMI). In this paper, we propose a new algorithm, called Fast Shannon Mutual Information (FSMI), which is significantly faster than existing methods at computing the exact Shannon mutual information. The key insight behind FSMI is recognizing that the integral over the sensor beam can be evaluated analytically, removing an expensive numerical integration. In addition, we provide a number of approximation techniques for FSMI, which significantly improve computation time. Equipped with these approximation techniques, the FSMI algorithm is more than three orders of magnitude faster than the existing computation for Shannon mutual information; it also outperforms the CSQMI algorithm significantly, being roughly twice as fast, in our experiments.
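To make the quantity in the abstract concrete, the following is a minimal illustrative sketch of the Shannon mutual information between a single sensor beam and the occupancy-grid cells it traverses. It is not the FSMI algorithm from the paper (which handles a noisy range measurement and evaluates the resulting integral analytically); instead it assumes a simplified noiseless sensor, where the only measurement outcomes are "the beam first hits cell i" or "the beam passes through all cells", so the MI reduces to an expected entropy reduction. The function name `beam_shannon_mi` and the independent-cell prior are assumptions of this sketch.

```python
import math

def binary_entropy(p: float) -> float:
    """Shannon entropy (in nats) of a Bernoulli(p) occupancy variable."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log(p) - (1.0 - p) * math.log(1.0 - p)

def beam_shannon_mi(occ_probs: list) -> float:
    """Shannon MI between one noiseless beam measurement and the cells it
    traverses, assuming independent occupancy priors occ_probs.

    Outcome e_i = "cells 0..i-1 free, cell i occupied", with probability
    p_i * prod_{j<i}(1 - p_j). Under a noiseless sensor, e_i fully resolves
    cells 0..i, so the posterior entropy comes only from cells beyond i;
    the "no hit" outcome resolves every cell.  MI = prior entropy minus
    expected posterior entropy.
    """
    prior_entropy = sum(binary_entropy(p) for p in occ_probs)
    expected_posterior = 0.0
    free_so_far = 1.0  # probability that all cells before i are free
    for i, p in enumerate(occ_probs):
        p_hit_i = free_so_far * p
        # cells beyond i keep their priors after outcome e_i
        expected_posterior += p_hit_i * sum(
            binary_entropy(q) for q in occ_probs[i + 1:]
        )
        free_so_far *= (1.0 - p)
    return prior_entropy - expected_posterior

# A single maximally uncertain cell: one beam resolves it completely,
# so the information gained is exactly one bit (ln 2 nats).
mi = beam_shannon_mi([0.5])
```

The nested sum over trailing cells makes this O(n^2) per beam; the paper's contribution is precisely avoiding this kind of per-beam cost (and the numerical integration that a noisy-sensor model would additionally require) through an analytic evaluation.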
Pages: 6912-6918
Number of pages: 7
Related Papers
50 records
  • [1] FSMI: Fast computation of Shannon mutual information for information-theoretic mapping
    Zhang, Zhengdong
    Henderson, Theia
    Karaman, Sertac
    Sze, Vivienne
    INTERNATIONAL JOURNAL OF ROBOTICS RESEARCH, 2020, 39 (09): : 1155 - 1177
  • [2] CRMI: Confidence-Rich Mutual Information for Information-Theoretic Mapping
    Xu, Yang
    Zheng, Ronghao
    Liu, Meiqin
    Zhang, Senlin
    IEEE ROBOTICS AND AUTOMATION LETTERS, 2021, 6 (04) : 6434 - 6441
  • [3] A Fast Information-Theoretic Approximation of Joint Mutual Information Feature Selection
    Liu, Heng
    Ditzler, Gregory
    2017 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2017, : 4610 - 4617
  • [4] Information-Theoretic Mapping Using Cauchy-Schwarz Quadratic Mutual Information
    Charrow, Benjamin
    Liu, Sikang
    Kumar, Vijay
    Michael, Nathan
    2015 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA), 2015, : 4791 - 4798
  • [5] Semantic OcTree Mapping and Shannon Mutual Information Computation for Robot Exploration
    Asgharivaskasi, Arash
    Atanasov, Nikolay
    IEEE TRANSACTIONS ON ROBOTICS, 2023, 39 (03) : 1910 - 1928
  • [6] Meeting Shannon: Information-theoretic thinking in engineering and science
    Varshney, Lav R.
    2008 IEEE INFORMATION THEORY WORKSHOP, 2008, : 5 - 5
  • [7] Fast Information-theoretic Bayesian Optimisation
    Ru, Binxin
    McLeod, Mark
    Granziol, Diego
    Osborne, Michael A.
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 80, 2018, 80
  • [8] Information-Theoretic Lower Bounds for Distributed Function Computation
    Xu, Aolin
    Raginsky, Maxim
    IEEE TRANSACTIONS ON INFORMATION THEORY, 2017, 63 (04) : 2314 - 2337
  • [9] Information-theoretic temporal Bell inequality and quantum computation
    Morikoshi, Fumiaki
    PHYSICAL REVIEW A, 2006, 73 (05)
  • [10] On Information-Theoretic Secure Multiparty Computation with Local Repairability
    Escudero, Daniel
    Tjuawinata, Ivan
    Xing, Chaoping
    PUBLIC-KEY CRYPTOGRAPHY, PT II, PKC 2024, 2024, 14602 : 205 - 239