A Scalable Shannon Entropy Estimator

Cited by: 1
Authors
Golia, Priyanka [1 ,2 ]
Juba, Brendan [3 ]
Meel, Kuldeep S. [2 ]
Affiliations
[1] Indian Inst Technol Kanpur, Kanpur, Uttar Pradesh, India
[2] Natl Univ Singapore, Singapore, Singapore
[3] Washington Univ, St Louis, MO 63110 USA
Funding
National Research Foundation of Singapore
Keywords
INFORMATION-FLOW; MODEL;
DOI
10.1007/978-3-031-13185-1_18
Chinese Library Classification (CLC)
TP3 [Computing Technology; Computer Technology]
Discipline Code
0812
Abstract
Quantified information flow (QIF) has emerged as a rigorous approach to quantitatively measuring confidentiality; the information-theoretic underpinning of QIF allows end-users to link the computed quantities with the computational effort required on the part of the adversary to gain access to the desired confidential information. In this work, we focus on the estimation of the Shannon entropy of a given program Π. As a first step, we focus on the case wherein a Boolean formula φ(X, Y) captures the relationship between the inputs X and outputs Y of Π. Such formulas φ(X, Y) have the property that for every valuation of X, there exists exactly one valuation of Y such that φ is satisfied. The existing techniques require O(2^m) model counting queries, where m = |Y|. We propose the first efficient algorithmic technique, called EntropyEstimation, to estimate the Shannon entropy of φ with PAC-style guarantees, i.e., the computed estimate is guaranteed to lie within a (1 ± ε)-factor of the ground truth with confidence at least 1 − δ. Furthermore, EntropyEstimation makes only O(min(m, n)/ε²) counting and sampling queries, where m = |Y| and n = |X|, thereby achieving a significant reduction in the number of model counting queries. We demonstrate the practical efficiency of our algorithmic framework via a detailed experimental evaluation. Our evaluation demonstrates that the proposed framework scales to formulas beyond the reach of previously known approaches.
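To make the quantity being estimated concrete: under a uniform distribution over the inputs X, the program induces a distribution over outputs Y, and its Shannon entropy is H = −Σ_y p(y) log₂ p(y), where p(y) is the fraction of inputs mapped to y. A minimal brute-force sketch (not the paper's algorithm, which avoids this exponential enumeration via counting and sampling queries) computes this exactly for a hypothetical toy program; `toy_program` and all names below are illustrative assumptions, not from the paper.

```python
from collections import Counter
from math import log2

def shannon_entropy_of_program(program, n_inputs):
    """Exact Shannon entropy of a program's output distribution,
    assuming inputs are uniform over {0,1}^n.
    Brute force: enumerates all 2^n inputs, which is exactly the
    exponential cost the paper's EntropyEstimation technique avoids."""
    counts = Counter()
    total = 2 ** n_inputs
    for x in range(total):
        bits = tuple((x >> i) & 1 for i in range(n_inputs))
        counts[program(bits)] += 1  # each input yields exactly one output
    return -sum((c / total) * log2(c / total) for c in counts.values())

# Hypothetical toy "program": outputs (x0 AND x1, x0 XOR x2)
def toy_program(bits):
    x0, x1, x2 = bits
    return (x0 & x1, x0 ^ x2)

print(shannon_entropy_of_program(toy_program, 3))  # ≈ 1.8113 bits
```

The output distribution here is {(0,0): 3/8, (0,1): 3/8, (1,0): 1/8, (1,1): 1/8}, giving H = 3 − (3/4)·log₂3 ≈ 1.8113 bits.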
Pages: 363 - 384
Number of pages: 22