Maximizing entropy over Markov processes

Cited by: 14
Authors
Biondi, Fabrizio [1 ]
Legay, Axel [1 ]
Nielsen, Bo Friis [2 ]
Wasowski, Andrzej [3 ]
Affiliations
[1] IRISA INRIA Rennes, F-35042 Rennes, France
[2] Tech Univ Denmark, DK-2800 Lyngby, Denmark
[3] IT Univ Copenhagen, DK-2300 Copenhagen S, Denmark
Keywords
INFORMATION-FLOW; NONINTERFERENCE;
DOI
10.1016/j.jlamp.2014.05.001
CLC classification
TP301 [Theory, Methods];
Discipline code
081202;
Abstract
The channel capacity of a deterministic system with confidential data is an upper bound on the number of bits of data an attacker can learn from the system. We encode all possible attacks on a system using a probabilistic specification, an Interval Markov Chain. Then the channel capacity computation reduces to finding a model of a specification with highest entropy. Entropy maximization for probabilistic process specifications has not been studied before, even though it is well known in Bayesian inference for discrete distributions. We give a characterization of the global entropy of a process as a reward function, a polynomial algorithm to verify the existence of a system maximizing entropy among those respecting a specification, a procedure for the maximization of reward functions over Interval Markov Chains, and its application to synthesize an implementation maximizing entropy. We show how to use Interval Markov Chains to model abstractions of deterministic systems with confidential data, and use the above results to compute their channel capacity. These results are a foundation for ongoing work on computing channel capacity for abstractions of programs derived from code. (C) 2014 Elsevier Inc. All rights reserved.
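The abstract's characterization of global entropy as a reward function can be illustrated with a minimal sketch (my own assumed example, not the paper's algorithm over Interval Markov Chains): for an absorbing Markov chain, take each state's local Shannon entropy as its reward and compute the expected accumulated reward from the start state by solving v = r + Qv over the transient states.

```python
import math

def local_entropy(row):
    """Shannon entropy (bits) of one state's outgoing distribution."""
    return -sum(p * math.log2(p) for p in row if p > 0)

def global_entropy(P, transient, start):
    """Expected total entropy accumulated from `start` until absorption.

    Solves v = r + Q v, where r is the local-entropy reward vector and
    Q is the transition matrix P restricted to the transient states.
    """
    n = len(transient)
    idx = {s: i for i, s in enumerate(transient)}
    # Build (I - Q) and the reward vector r.
    A = [[(1.0 if i == j else 0.0) - P[transient[i]][transient[j]]
          for j in range(n)] for i in range(n)]
    b = [local_entropy(P[s]) for s in transient]
    # Gaussian elimination with partial pivoting.
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    v = [0.0] * n
    for r in reversed(range(n)):
        v[r] = (b[r] - sum(A[r][c] * v[c] for c in range(r + 1, n))) / A[r][r]
    return v[idx[start]]

# Two uniform binary choices in sequence: state 0 branches to 1 or 2,
# each of which branches to absorbing states 3 or 4; 1 + 1 = 2 bits.
P = [
    [0.0, 0.5, 0.5, 0.0, 0.0],
    [0.0, 0.0, 0.0, 0.5, 0.5],
    [0.0, 0.0, 0.0, 0.5, 0.5],
    [0.0, 0.0, 0.0, 1.0, 0.0],
    [0.0, 0.0, 0.0, 0.0, 1.0],
]
print(global_entropy(P, transient=[0, 1, 2], start=0))  # 2.0
```

The paper's contribution is maximizing this quantity when the transition probabilities are only constrained to intervals; the fixed-matrix computation above is just the evaluation step.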
Pages: 384-399
Page count: 16
Related papers
50 items in total
  • [21] Entropy Maximization for Partially Observable Markov Decision Processes
    Savas, Yagiz
    Hibbard, Michael
    Wu, Bo
    Tanaka, Takashi
    Topcu, Ufuk
    IEEE TRANSACTIONS ON AUTOMATIC CONTROL, 2022, 67 (12) : 6948 - 6955
  • [22] New bounds on the entropy rate of hidden Markov processes
    Ordentlich, E
    Weissman, T
    2004 IEEE INFORMATION THEORY WORKSHOP, PROCEEDINGS, 2004, : 117 - 122
  • [24] Markov and non-Markov processes in complex systems by the dynamical information entropy
    Yulmetyev, RM
    Gafarov, FM
    PHYSICA A-STATISTICAL MECHANICS AND ITS APPLICATIONS, 1999, 274 (1-2) : 381 - 384
  • [25] Relative Entropy and Error Bounds for Filtering of Markov Processes
    J. M. C. Clark
    D. L. Ocone
    C. Coumarbatch
    Mathematics of Control, Signals and Systems, 1999, 12 : 346 - 360
  • [26] Thermodynamics of Markov processes with nonextensive entropy and free energy
    Peng, Liangrong
    Qian, Hong
    Hong, Liu
    PHYSICAL REVIEW E, 2020, 101 (02)
  • [27] Entropy of Hidden Markov Processes via Cycle Expansion
    Allahverdyan, Armen E.
    JOURNAL OF STATISTICAL PHYSICS, 2008, 133 (03) : 535 - 564
  • [29] On the Entropy of Fractionally Integrated Gauss-Markov Processes
    Abundo, Mario
    Pirozzi, Enrica
    MATHEMATICS, 2020, 8 (11) : 1 - 10
  • [30] Maximum entropy mixing time of circulant Markov processes
    Avrachenkov, Konstantin
    Cottatellucci, Laura
    Maggi, Lorenzo
    Mao, Yong-Hua
    STATISTICS & PROBABILITY LETTERS, 2013, 83 (03) : 768 - 773