Geometric Lower Bounds for Distributed Parameter Estimation Under Communication Constraints

Cited by: 5
Authors
Han, Yanjun [1,2]
Ozgur, Ayfer [1]
Weissman, Tsachy [1]
Affiliations
[1] Stanford Univ, Dept Elect Engn, Stanford, CA 94305 USA
[2] Univ Calif Berkeley, Simons Inst Theory Comp, Berkeley, CA 94720 USA
Funding
U.S. National Science Foundation;
Keywords
Distributed estimation; minimax lower bound; high-dimensional geometry; blackboard communication protocol; strong data processing inequality; INFERENCE;
DOI
10.1109/TIT.2021.3108952
Chinese Library Classification (CLC)
TP [Automation Technology; Computer Technology];
Discipline classification code
0812;
Abstract
We consider parameter estimation in distributed networks, where each sensor in the network observes an independent sample from an underlying distribution and has k bits to communicate its sample to a centralized processor, which computes an estimate of a desired parameter. We develop lower bounds for the minimax risk of estimating the underlying parameter for a large class of losses and distributions. Our results show that under mild regularity conditions, the communication constraint reduces the effective sample size by a factor of d when k is small, where d is the dimension of the estimated parameter. Furthermore, this penalty decreases at most exponentially with increasing k, and this exponential decrease is achieved for some models, e.g., when estimating high-dimensional distributions. For other models, however, we show that the sample size reduction is remediated only linearly with increasing k, e.g., when some sub-Gaussian structure is available. We apply our results to the distributed settings of the product Bernoulli model, the multinomial model, Gaussian location models, and logistic regression, recovering or strengthening existing results. Our approach deviates significantly from existing approaches for developing information-theoretic lower bounds for communication-efficient estimation. We circumvent the need for strong data processing inequalities used in prior work and develop a geometric approach which builds on a new representation of the communication constraint. This approach allows us to strengthen and generalize existing results with simpler and more transparent proofs.
Pages: 8248-8263
Number of pages: 16
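
As an informal illustration of the scaling described in the abstract (a paraphrase up to constants, assuming the regularity conditions stated in the paper; the exact theorem statements differ), the lower bounds take the following shape for two canonical models. For the d-dimensional Gaussian location model with n sensors, k bits per sensor (k <= d), and squared-error loss,

\inf_{\hat{\theta}} \sup_{\theta} \mathbb{E}\, \| \hat{\theta} - \theta \|_2^2 \;\gtrsim\; \frac{\sigma^2 d}{n} \cdot \frac{d}{k},

so the effective sample size drops from n to roughly n k / d and is remediated only linearly in k, whereas for estimating a discrete distribution p on d symbols under squared \ell_2 loss,

\inf_{\hat{p}} \sup_{p} \mathbb{E}\, \| \hat{p} - p \|_2^2 \;\gtrsim\; \frac{d}{n \min(2^k, d)},

so the effective sample size is roughly n \, 2^k / d and the communication penalty decays exponentially in k.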