The Kullback-Leibler Divergence evolution of randomized electromagnetic field

Times Cited: 1
Authors
Shi, Tian [1 ,2 ]
Li, Yunzhou [3 ]
Shi, Qingfan [1 ]
Li, Liangsheng [2 ]
Zheng, Ning [1 ]
Affiliations
[1] Beijing Inst Technol, Sch Phys, Beijing 100081, Peoples R China
[2] Sci & Technol Electromagnet Scattering Lab, Beijing 100854, Peoples R China
[3] Kunming Shipbldg Equipment Res & Test Ctr, Kunming 650051, Yunnan, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Statistical optics; Kullback-Leibler Divergence; Random media; Randomized electromagnetic field; SCATTERING; DISTRIBUTIONS;
DOI
10.1016/j.optcom.2020.126411
Chinese Library Classification (CLC)
O43 [Optics];
Discipline Codes
070207; 0803
Abstract
The statistical properties of randomized electromagnetic fields have been investigated by simulating the scattering of electromagnetic waves by random media (RMs). From a theoretical standpoint, the probability density function of a perfectly randomized field satisfies zero-mean circular Gaussian (ZMCG) statistics. We find that the Kullback-Leibler Divergence (KLD) can quantitatively evaluate the randomness of an electromagnetic field and thereby guide the design of RMs. Two types of RMs can efficiently randomize electromagnetic waves simply by monotonically increasing the permittivity of the scatterers. When both the permittivity and the total number of scatterers are fixed, the KLD always reaches a limit in any practical RM, characterized by a plateau on the evolutionary curve.
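The idea of scoring field randomness against ZMCG statistics can be illustrated with a minimal sketch (not the authors' code): a hypothetical random-phasor-sum model stands in for the scattering simulation, and the KLD is estimated between the histogram of the field's real part and a zero-mean Gaussian of matched variance. A small KLD indicates a field close to perfectly randomized.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_field(n_samples, n_scatterers):
    """Toy field model: a normalized sum of unit phasors with random phases.
    As n_scatterers grows, the field approaches ZMCG statistics."""
    phases = rng.uniform(0.0, 2.0 * np.pi, size=(n_samples, n_scatterers))
    return np.exp(1j * phases).sum(axis=1) / np.sqrt(n_scatterers)

def kld_to_gaussian(field, bins=100):
    """Histogram estimate of D(p || q): p is the empirical density of the
    field's real part, q a zero-mean Gaussian with the sample variance."""
    x = field.real
    hist, edges = np.histogram(x, bins=bins, density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    width = edges[1] - edges[0]
    sigma2 = x.var()
    q = np.exp(-centers**2 / (2.0 * sigma2)) / np.sqrt(2.0 * np.pi * sigma2)
    mask = (hist > 0) & (q > 0)  # skip empty bins to keep the log finite
    return float(np.sum(hist[mask] * np.log(hist[mask] / q[mask]) * width))

# Few scatterer contributions: far from ZMCG, large KLD.
k_few = kld_to_gaussian(random_field(100_000, 2))
# Many contributions: close to ZMCG, small KLD.
k_many = kld_to_gaussian(random_field(100_000, 50))
print(k_few, k_many)
```

In this toy model the KLD decreases as the number of independent phasor contributions grows, mirroring the paper's use of the KLD as a convergence measure toward perfect randomization.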
Pages: 6