Information Bottleneck Methods for Distributed Learning

Cited by: 0
Authors
Farajiparvar, Parinaz [1 ]
Beirami, Ahmad [2 ]
Nokleby, Matthew [1 ]
Affiliations
[1] Wayne State Univ, Dept Elect & Comp Engn, Detroit, MI 48202 USA
[2] MIT, Elect Res Lab, Cambridge, MA 02139 USA
Keywords
Machine Learning; Rate-distortion function; Information Bottleneck; Distributed Learning; Streaming Data
DOI
None available
CLC Number
TP [Automation Technology; Computer Technology]
Discipline Code
0812
Abstract
We study a distributed learning problem in which Alice sends a compressed distillation of a set of training data to Bob, who uses the distilled version to best solve an associated learning problem. We formalize this as a rate-distortion problem in which the training set is the source and Bob's cross-entropy loss is the distortion measure. We consider this problem for unsupervised learning with both batch and sequential data. For batch data, the problem is equivalent to the information bottleneck (IB), and we show that reduced-complexity versions of standard IB methods solve the associated rate-distortion problem. For streaming data, we present a new algorithm, which may be of independent interest, that solves the rate-distortion problem for Gaussian sources. Furthermore, to improve the results of this iterative algorithm on sequential data, we introduce a two-pass version. Finally, we characterize how the rate must depend on the number of samples k for Gaussian sources to ensure a cross-entropy loss that scales optimally as the training set grows.
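The batch-data case reduces to the information bottleneck, which is classically solved by alternating Blahut-Arimoto-style updates on the encoder p(t|x), the cluster marginal p(t), and the decoder p(y|t). A minimal sketch of that standard iteration for a discrete joint distribution is below; the function name and variable names are illustrative, and this is the generic IB iteration (Tishby et al.), not the reduced-complexity variant developed in the paper.

```python
import numpy as np

def ib_iterate(p_xy, n_t, beta, n_iter=200, seed=0):
    """Standard iterative information bottleneck for a discrete joint
    distribution p(x, y). Returns the soft encoder p(t|x).
    beta trades off compression I(X;T) against relevance I(T;Y)."""
    rng = np.random.default_rng(seed)
    n_x, n_y = p_xy.shape
    p_x = p_xy.sum(axis=1)                     # marginal p(x)
    p_y_given_x = p_xy / p_x[:, None]          # conditional p(y|x)

    # random initial soft encoder p(t|x), rows normalized
    p_t_given_x = rng.random((n_x, n_t))
    p_t_given_x /= p_t_given_x.sum(axis=1, keepdims=True)

    eps = 1e-12
    for _ in range(n_iter):
        p_t = p_t_given_x.T @ p_x              # marginal p(t)
        # decoder: p(y|t) = sum_x p(y|x) p(x|t)
        p_x_given_t = (p_t_given_x * p_x[:, None]).T / np.maximum(p_t[:, None], eps)
        p_y_given_t = p_x_given_t @ p_y_given_x
        # KL(p(y|x) || p(y|t)) for every (x, t) pair
        log_ratio = (np.log(np.maximum(p_y_given_x[:, None, :], eps))
                     - np.log(np.maximum(p_y_given_t[None, :, :], eps)))
        kl = (p_y_given_x[:, None, :] * log_ratio).sum(axis=2)
        # encoder update: p(t|x) proportional to p(t) * exp(-beta * KL)
        logits = np.log(np.maximum(p_t[None, :], eps)) - beta * kl
        p_t_given_x = np.exp(logits - logits.max(axis=1, keepdims=True))
        p_t_given_x /= p_t_given_x.sum(axis=1, keepdims=True)
    return p_t_given_x
```

Larger beta favors preserving information about Y at the cost of a higher rate; the paper's contribution includes reduced-complexity solvers for this objective and a streaming counterpart for Gaussian sources.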
Pages: 24-31
Page count: 8