A Scalable Shannon Entropy Estimator

Cited by: 1
Authors
Golia, Priyanka [1 ,2 ]
Juba, Brendan [3 ]
Meel, Kuldeep S. [2 ]
Institutions
[1] Indian Inst Technol Kanpur, Kanpur, Uttar Pradesh, India
[2] Natl Univ Singapore, Singapore, Singapore
[3] Washington Univ, St Louis, MO 63110 USA
Funding
National Research Foundation of Singapore;
Keywords
INFORMATION-FLOW; MODEL;
DOI
10.1007/978-3-031-13185-1_18
CLC Classification
TP3 [Computing Technology, Computer Technology];
Subject Classification Code
0812;
Abstract
Quantified information flow (QIF) has emerged as a rigorous approach to quantitatively measure confidentiality; the information-theoretic underpinning of QIF allows end-users to link the computed quantities with the computational effort required on the part of the adversary to gain access to the desired confidential information. In this work, we focus on the estimation of Shannon entropy for a given program Π. As a first step, we focus on the case wherein a Boolean formula φ(X, Y) captures the relationship between the inputs X and outputs Y of Π. Such formulas φ(X, Y) have the property that for every valuation of X, there exists exactly one valuation of Y such that φ is satisfied. The existing techniques require O(2^m) model counting queries, where m = |Y|. We propose the first efficient algorithmic technique, called EntropyEstimation, to estimate the Shannon entropy of φ with PAC-style guarantees, i.e., the computed estimate is guaranteed to lie within a (1 ± ε)-factor of the ground truth with confidence at least 1 − δ. Furthermore, EntropyEstimation makes only O(min(m, n)/ε²) counting and sampling queries, where m = |Y| and n = |X|, thereby achieving a significant reduction in the number of model counting queries. We demonstrate the practical efficiency of our algorithmic framework via a detailed experimental evaluation, which shows that the proposed framework scales to formulas beyond the reach of previously known approaches.
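To make the estimated quantity concrete, the sketch below (Python, purely illustrative and not the authors' EntropyEstimation algorithm) computes the Shannon entropy of the output distribution induced by a functional relation φ(X, Y) under uniformly random inputs, using the brute-force enumeration that issues one model-counting query per output valuation, i.e., the O(2^m) baseline the abstract says existing techniques require. The function names and the toy formula are assumptions introduced only for illustration.

```python
# Illustrative sketch only: brute-force Shannon entropy of the output distribution
# of a functional Boolean relation phi(X, Y), assuming inputs uniform over {0,1}^n.
# This is the O(2^m) baseline (one counting query per output valuation), not the
# paper's EntropyEstimation algorithm; `phi`, `n`, and `m` are hypothetical inputs.
import math
from itertools import product
from typing import Callable, Tuple

def entropy_by_enumeration(
    phi: Callable[[Tuple[bool, ...], Tuple[bool, ...]], bool],
    n: int,   # |X|: number of input variables
    m: int,   # |Y|: number of output variables
) -> float:
    total_inputs = 2 ** n
    h = 0.0
    for y in product((False, True), repeat=m):
        # Model-counting query: how many inputs x map to this output valuation y?
        count = sum(1 for x in product((False, True), repeat=n) if phi(x, y))
        if count:
            p = count / total_inputs          # p_y = |{x : phi(x, y)}| / 2^n
            h -= p * math.log2(p)             # H = -sum_y p_y * log2(p_y)
    return h

# Toy example: a single output bit Y = X1 AND X2; entropy is about 0.811 bits.
print(entropy_by_enumeration(lambda x, y: y[0] == (x[0] and x[1]), n=2, m=1))
```

The point of the paper is precisely to avoid this 2^m enumeration, replacing it with O(min(m, n)/ε²) counting and sampling queries while retaining PAC-style (1 ± ε, 1 − δ) guarantees.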
Pages: 363 - 384
Number of pages: 22
Related Papers
50 items in total
  • [31] Polynomial Functors and Shannon Entropy
    Spivak, David I.
    ELECTRONIC PROCEEDINGS IN THEORETICAL COMPUTER SCIENCE, 2023, (380) : 331 - 343
  • [32] SHANNON-INFORMATION IS NOT ENTROPY
    SCHIFFER, M
    PHYSICS LETTERS A, 1991, 154 (7-8) : 361 - 365
  • [33] Statistical Estimation of the Shannon Entropy
    Bulinski, Alexander
    Dimitrov, Denis
    ACTA MATHEMATICA SINICA-ENGLISH SERIES, 2019, 35 (01) : 17 - 46
  • [34] Renyi extrapolation of Shannon entropy
    Zyczkowski, K
    OPEN SYSTEMS & INFORMATION DYNAMICS, 2003, 10 (03) : 297 - 310
  • [35] CYCLIC SYMMETRY AND THE SHANNON ENTROPY
    NATH, P
    KAUR, MM
    INFORMATION SCIENCES, 1981, 25 (03) : 217 - 234
  • [36] Estimating the variance of Shannon entropy
    Ricci, Leonardo
    Perinelli, Alessio
    Castelluzzo, Michele
    PHYSICAL REVIEW E, 2021, 104 (02)
  • [37] SQUEEZED STATES AND SHANNON ENTROPY
    ALIAGA, J
    CRESPO, G
    PROTO, AN
    PHYSICAL REVIEW A, 1994, 49 (06) : 5146 - 5148
  • [38] Statistical Estimation of the Shannon Entropy
    Bulinski, Alexander
    Dimitrov, Denis
    ACTA MATHEMATICA SINICA-ENGLISH SERIES, 2019, 35 (01) : 17 - 46
  • [39] On the relation between the Shannon entropy and the von Neumann entropy
    Ho, SW
    Yeung, RW
    2004 IEEE INTERNATIONAL SYMPOSIUM ON INFORMATION THEORY, PROCEEDINGS, 2004 : 42 - 42
  • [40] AN INTRODUCTION TO LOGICAL ENTROPY AND ITS RELATION TO SHANNON ENTROPY
    Ellerman, David
    INTERNATIONAL JOURNAL OF SEMANTIC COMPUTING, 2013, 7 (02) : 121 - 145