Partial Information Decomposition: Redundancy as Information Bottleneck

Times Cited: 0
Authors
Kolchinsky, Artemy [1 ,2 ]
Affiliations
[1] Univ Pompeu Fabra, ICREA Complex Syst Lab, Barcelona 08003, Spain
[2] Univ Tokyo, Universal Biol Inst, Tokyo 1130033, Japan
Keywords
partial information decomposition; information bottleneck; rate distortion; redundancy
DOI
10.3390/e26070546
Chinese Library Classification (CLC)
O4 [Physics];
Discipline Code
0702;
Abstract
The partial information decomposition (PID) aims to quantify the amount of redundant information that a set of sources provides about a target. Here, we show that this goal can be formulated as a type of information bottleneck (IB) problem, termed the "redundancy bottleneck" (RB). The RB formalizes a tradeoff between prediction and compression: it extracts information from the sources that best predict the target, without revealing which source provided the information. It can be understood as a generalization of "Blackwell redundancy", which we previously proposed as a principled measure of PID redundancy. The "RB curve" quantifies the prediction-compression tradeoff at multiple scales. This curve can also be quantified for individual sources, allowing subsets of redundant sources to be identified without combinatorial optimization. We provide an efficient iterative algorithm for computing the RB curve.
Pages: 23
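
For context on the tradeoff described in the abstract: the classic information bottleneck seeks an encoder q(t|x) that balances compression I(X;T) against prediction I(T;Y) by minimizing I(X;T) - beta*I(T;Y), and it can be solved with a simple self-consistent iteration (Tishby, Pereira and Bialek). The sketch below implements only that standard IB iteration on a made-up toy joint distribution; it is not the paper's redundancy-bottleneck (RB) algorithm, whose additional requirement of not revealing which source provided the information uses the iterative scheme described in the paper. The function name and the toy p(x, y) are illustrative assumptions, not taken from the source.

# Minimal sketch of the classic information bottleneck self-consistent iteration.
# NOT the paper's redundancy-bottleneck algorithm; it only illustrates the
# prediction-compression tradeoff that the RB generalizes.
import numpy as np

def ib_iterate(p_xy, n_clusters=2, beta=5.0, n_iter=200, seed=0, eps=1e-12):
    """Return an encoder q(t|x) for the IB Lagrangian I(X;T) - beta * I(T;Y)."""
    rng = np.random.default_rng(seed)
    n_x, n_y = p_xy.shape
    p_x = p_xy.sum(axis=1)                      # marginal p(x)
    p_y_given_x = p_xy / (p_x[:, None] + eps)   # conditional p(y|x)

    # Random initial soft assignment q(t|x); rows sum to 1.
    q_t_given_x = rng.random((n_x, n_clusters))
    q_t_given_x /= q_t_given_x.sum(axis=1, keepdims=True)

    for _ in range(n_iter):
        # Cluster marginal q(t) and decoder q(y|t).
        q_t = q_t_given_x.T @ p_x
        q_y_given_t = (q_t_given_x * p_x[:, None]).T @ p_y_given_x
        q_y_given_t /= (q_t[:, None] + eps)

        # KL divergence D[p(y|x) || q(y|t)] for every (x, t) pair.
        log_ratio = np.log((p_y_given_x[:, None, :] + eps) /
                           (q_y_given_t[None, :, :] + eps))
        kl = (p_y_given_x[:, None, :] * log_ratio).sum(axis=2)

        # Self-consistent update: q(t|x) proportional to q(t) * exp(-beta * KL).
        logits = np.log(q_t + eps)[None, :] - beta * kl
        logits -= logits.max(axis=1, keepdims=True)   # numerical stability
        q_t_given_x = np.exp(logits)
        q_t_given_x /= q_t_given_x.sum(axis=1, keepdims=True)

    return q_t_given_x

if __name__ == "__main__":
    # Toy joint distribution over 4 source states and 2 target states.
    p_xy = np.array([[0.30, 0.05],
                     [0.25, 0.05],
                     [0.05, 0.15],
                     [0.05, 0.10]])
    print(np.round(ib_iterate(p_xy, beta=10.0), 3))

Sweeping beta and recording the resulting compression and prediction terms traces out a standard IB tradeoff curve; the paper's RB curve plays an analogous role for redundancy, quantifying the prediction-compression tradeoff at multiple scales as stated in the abstract.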