An information fractal dimensional relative entropy

Cited by: 1
Authors
Wu, Jingyou [1 ]
Affiliation
[1] Univ Elect Sci & Technol China, Inst Fac Math Sci & Fundamental & Frontier Sci, Chengdu 610054, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Computational complexity; Probability distributions
DOI
10.1063/5.0189038
Chinese Library Classification
TB3 [Engineering Materials]
Subject Classification Code
0805; 080502
Abstract
Shannon entropy measures information uncertainty, while the information dimension measures information complexity. Given two probability distributions, their difference can be measured by relative entropy. However, the existing relative entropy does not account for the effect of information dimension. To improve on it, a new relative entropy is presented in this paper, in which the information fractal dimension is taken into account. The new relative entropy is more general than the classical relative entropy: when dimension is not considered, it degenerates to the classical relative entropy. Another interesting point is that the new relative entropy may take negative values; the physical meaning of this is still under exploration. Finally, some application examples are provided to illustrate the use of the proposed relative entropy. (c) 2024 Author(s).
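The record does not give the paper's formula, but the abstract states that the proposed measure reduces to the classical relative entropy (KL divergence) when dimension is ignored. The sketch below shows that classical baseline for discrete distributions, together with an illustrative information-dimension estimator based on Rényi's definition (entropy over log-resolution); the estimator is an assumption for illustration, not the paper's own construction.

```python
import numpy as np

def relative_entropy(p, q):
    """Classical relative entropy (KL divergence) D(p || q) in bits,
    for discrete distributions p and q over the same support.
    The proposed fractal-dimensional relative entropy is said to
    degenerate to this when dimension is not considered."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0  # terms with p_i = 0 contribute 0 by convention
    return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

def information_dimension(p, eps):
    """Illustrative information-dimension estimate at resolution eps,
    following Renyi: d ~ H(p) / log2(1/eps), where H is the Shannon
    entropy (bits) of the distribution over boxes of size eps.
    This is an assumed estimator, not the formula from the paper."""
    p = np.asarray(p, dtype=float)
    h = -np.sum(p[p > 0] * np.log2(p[p > 0]))
    return float(h / np.log2(1.0 / eps))
```

For example, a uniform distribution over four boxes of size 0.25 has entropy 2 bits and log2(1/0.25) = 2, giving an information dimension of 1, as expected for a uniformly spread one-dimensional quantity. Note that D(p || q) itself is always non-negative; the possible negativity mentioned in the abstract arises only in the dimension-weighted generalization.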
Pages: 6