On Binary Quantizer For Maximizing Mutual Information

Cited by: 11
Authors: Nguyen, Thuan Duc [1]; Nguyen, Thinh [1]
Affiliations: [1] Oregon State Univ, Sch Elect Engn & Comp Sci, Corvallis, OR 97331 USA
Keywords: Channel quantization; mutual information; threshold; optimization
DOI: 10.1109/TCOMM.2020.3002910
CLC Classification: TM (Electrical Engineering); TN (Electronics and Communication Technology)
Discipline Codes: 0808; 0809
Abstract
We consider a channel with a binary input X corrupted by continuous-valued noise, resulting in a continuous-valued output Y. An optimal binary quantizer maps the continuous-valued output Y to a final binary output Z so as to maximize the mutual information I(X; Z). We show that when the ratio of the channel conditional densities r(y) = P(Y = y | X = 0)/P(Y = y | X = 1) is a strictly increasing or decreasing function of y, a quantizer with a single threshold maximizes the mutual information. Furthermore, we show that an optimal quantizer (possibly with multiple thresholds) is one whose thresholding vector consists of all solutions of r(y) = r* for some constant r* > 0. In addition, we characterize necessary conditions, via a fixed-point theorem, for the optimality and uniqueness of a quantizer. Based on these conditions, we propose an efficient procedure for determining all locally optimal quantizers, from which a globally optimal quantizer can be found. Our results also confirm several previous results via alternative elementary proofs.
Pages: 5435-5445 (11 pages)
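
The abstract's single-threshold result lends itself to a short numerical illustration. Below is a minimal sketch (not the authors' procedure): it assumes a BPSK-style mapping of X = 0 to +1 and X = 1 to -1 over AWGN with unit noise variance and a uniform prior, under which the likelihood ratio r(y) is strictly increasing, so by the paper's result a single threshold suffices; a grid search over the threshold then recovers the maximizer of I(X; Z). All parameters here (SIGMA, P0, the grid range) are illustrative assumptions.

```python
import numpy as np
from scipy.stats import norm

# Illustrative sketch, not the authors' code. Assumed model: binary input
# X in {0, 1} is mapped to +1 / -1 (BPSK-style) and corrupted by AWGN, so
# Y | X=0 ~ N(+1, SIGMA^2) and Y | X=1 ~ N(-1, SIGMA^2). The likelihood
# ratio r(y) = p(y | X=0) / p(y | X=1) = exp(2y / SIGMA^2) is strictly
# increasing in y, so a single-threshold quantizer Z = 1{Y > t} is optimal;
# we locate the best t by a simple grid search over I(X; Z).

SIGMA = 1.0   # assumed noise standard deviation
P0 = 0.5      # assumed prior P(X = 0)

def binary_entropy(p):
    """H(p) in bits, clipped away from 0 and 1 for numerical safety."""
    p = np.clip(p, 1e-12, 1.0 - 1e-12)
    return -p * np.log2(p) - (1.0 - p) * np.log2(1.0 - p)

def mutual_information(t):
    """I(X; Z) of the binary channel induced by the threshold quantizer."""
    a = 1.0 - norm.cdf(t, loc=+1.0, scale=SIGMA)   # P(Z=1 | X=0)
    b = 1.0 - norm.cdf(t, loc=-1.0, scale=SIGMA)   # P(Z=1 | X=1)
    pz1 = P0 * a + (1.0 - P0) * b                  # P(Z=1)
    # I(X; Z) = H(Z) - H(Z | X)
    return binary_entropy(pz1) - (P0 * binary_entropy(a) + (1.0 - P0) * binary_entropy(b))

thresholds = np.linspace(-4.0, 4.0, 2001)
t_star = max(thresholds, key=mutual_information)
print(f"optimal threshold ~ {t_star:.3f}, I(X;Z) ~ {mutual_information(t_star):.4f} bits")
```

For this symmetric setup the search returns a threshold near 0, as symmetry predicts; with an asymmetric prior or unequal noise levels, the maximizing threshold shifts toward the less likely input.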