Private and Communication-Efficient Algorithms for Entropy Estimation

Cited by: 0
Authors:
Bravo-Hermsdorff, Gecia [1]
Busa-Fekete, Robert [2]
Ghavamzadeh, Mohammad [2]
Medina, Andres Munoz [2]
Syed, Umar [2]
Affiliations:
[1] UCL, Dept. of Statistics, London, England
[2] Google Research, Mountain View, CA, USA
DOI: not available
Chinese Library Classification: TP18 [Artificial Intelligence Theory]
Discipline codes: 081104; 0812; 0835; 1405
Abstract:
Modern statistical estimation is often performed in a distributed setting where each sample belongs to a single user who shares their data with a central server. Users are typically concerned with preserving the privacy of their samples, and also with minimizing the amount of data they must transmit to the server. We give improved private and communication-efficient algorithms for estimating several popular measures of the entropy of a distribution. All of our algorithms have constant communication cost and satisfy local differential privacy. For a joint distribution over many variables whose conditional independence is given by a tree, we describe algorithms for estimating Shannon entropy that require a number of samples that is linear in the number of variables, compared to the quadratic sample complexity of prior work. We also describe an algorithm for estimating Gini entropy whose sample complexity has no dependence on the support size of the distribution and can be implemented using a single round of concurrent communication between the users and the server. In contrast, the previously best-known algorithm has high communication cost and requires the server to facilitate interaction between the users. Finally, we describe an algorithm for estimating collision entropy that matches the space and sample complexity of the best known algorithm but generalizes it to the private and communication-efficient setting.
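To see why Gini entropy admits estimation with no dependence on the support size, note that Gini entropy equals 1 - P(collision), where a collision is the event that two independent samples are equal. The sketch below is a minimal, non-private illustration of this idea (the paper's actual protocol additionally satisfies local differential privacy): users are paired, and each pair contributes a single bit indicating whether their two samples collide, so communication is constant per pair. The distribution `p` and function names here are illustrative, not taken from the paper.

```python
import random

def gini_entropy(p):
    """Exact Gini entropy 1 - sum_i p_i^2 of a discrete distribution."""
    return 1.0 - sum(q * q for q in p.values())

def estimate_gini(samples):
    """Collision-based estimate: pair consecutive samples and record,
    per pair, one bit indicating whether the two samples are equal.
    Since P(collision) = sum_i p_i^2, the estimate is 1 - mean(bits)."""
    bits = [int(a == b) for a, b in zip(samples[0::2], samples[1::2])]
    return 1.0 - sum(bits) / len(bits)

random.seed(0)
p = {"a": 0.5, "b": 0.3, "c": 0.2}
samples = random.choices(list(p), weights=list(p.values()), k=200_000)
print(round(gini_entropy(p), 3))   # exact value: 0.62
print(round(estimate_gini(samples), 3))
```

Note that neither the estimator nor its accuracy depends on how many symbols the distribution supports, only on the number of sample pairs; this is the property the abstract's Gini result preserves under privacy and communication constraints.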
Pages: 12