On Binary Quantizer For Maximizing Mutual Information

Cited by: 11
Authors
Nguyen, Thuan Duc [1]
Nguyen, Thinh [1]
Affiliations
[1] Oregon State Univ, Sch Elect Engn & Comp Sci, Corvallis, OR 97331 USA
Keywords
Channel quantization; mutual information; threshold; optimization
DOI
10.1109/TCOMM.2020.3002910
Chinese Library Classification
TM [Electrical Technology]; TN [Electronic Technology, Communication Technology]
Subject Classification Codes
0808; 0809
Abstract
We consider a channel with a binary input X corrupted by continuous-valued noise, resulting in a continuous-valued output Y. An optimal binary quantizer maps the continuous-valued output Y to the final binary output Z so as to maximize the mutual information I(X; Z). We show that when the ratio of the channel conditional densities r(y) = P(Y = y | X = 0) / P(Y = y | X = 1) is a strictly increasing or decreasing function of y, a quantizer with a single threshold maximizes the mutual information. Furthermore, we show that an optimal quantizer (possibly with multiple thresholds) is the one whose thresholding vector consists of all solutions of r(y) = r* for some constant r* > 0. In addition, we characterize necessary conditions, via a fixed-point theorem, for the optimality and uniqueness of a quantizer. Based on these conditions, we propose an efficient procedure for determining all locally optimal quantizers, from which a globally optimal quantizer can be found. Our results also confirm some previous results via alternative elementary proofs.
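The abstract's setting can be illustrated numerically. The sketch below (not the paper's procedure) brute-forces the single threshold t that maximizes I(X; Z) for a hypothetical binary-input additive-Gaussian channel with equiprobable inputs mapped to ±1 and unit noise variance; in this symmetric case the likelihood ratio r(y) is strictly monotone, so a single-threshold quantizer suffices and the maximizer is t ≈ 0. All channel parameters and the grid-search approach are assumptions for illustration only.

```python
# Minimal sketch, assuming a BPSK-style channel: X=0 -> -1, X=1 -> +1,
# additive Gaussian noise with sigma = 1, equiprobable inputs, and a
# single-threshold quantizer Z = 1{Y > t}. Not the paper's algorithm;
# just a brute-force grid search over thresholds.
import numpy as np
from scipy.stats import norm

def binary_entropy(p):
    """H(p) in bits, with the convention 0*log2(0) = 0."""
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -(p * np.log2(p) + (1 - p) * np.log2(1 - p))

def mutual_information(t, sigma=1.0, p0=0.5):
    """I(X; Z) in bits for the threshold quantizer Z = 1{Y > t}."""
    q0 = norm.sf(t, loc=-1.0, scale=sigma)   # P(Z=1 | X=0)
    q1 = norm.sf(t, loc=+1.0, scale=sigma)   # P(Z=1 | X=1)
    pz1 = p0 * q0 + (1 - p0) * q1            # marginal P(Z=1)
    # I(X; Z) = H(Z) - H(Z | X)
    return binary_entropy(pz1) - (p0 * binary_entropy(q0)
                                  + (1 - p0) * binary_entropy(q1))

# Grid search over candidate thresholds; the symmetric setup should give t ~ 0.
ts = np.linspace(-3.0, 3.0, 601)
best_t = ts[np.argmax([mutual_information(t) for t in ts])]
print(f"best threshold ~ {best_t:.2f}, I(X;Z) ~ {mutual_information(best_t):.4f} bits")
```

For asymmetric or multimodal noise, where r(y) is not monotone, the paper's characterization (thresholds at all solutions of r(y) = r*) would require searching over multi-threshold quantizers rather than the single threshold assumed here.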
Pages: 5435 - 5445
Number of pages: 11
Related Papers
50 records in total
  • [1] Optimal Quantizer Structure for Maximizing Mutual Information Under Constraints
    Nguyen, Thuan
    Nguyen, Thinh
    IEEE TRANSACTIONS ON COMMUNICATIONS, 2021, 69 (11) : 7406 - 7413
  • [2] A mutual information-maximizing quantizer based on the noise-injected threshold array
    Zhai, Qiqing
    Wang, Youguo
    DIGITAL SIGNAL PROCESSING, 2024, 146
  • [3] On Mutual Information-Based Optimal Quantizer Design
    Dulek, Berkan
    IEEE COMMUNICATIONS LETTERS, 2022, 26 (05) : 1008 - 1011
  • [4] Optimal Thresholding Quantizer Maximizing Mutual Information of Discrete Multiple-Input Continuous One-Bit Output Quantization
    Nguyen, Thuan
    Nguyen, Thinh
    2021 IEEE INTERNATIONAL SYMPOSIUM ON INFORMATION THEORY (ISIT), 2021, : 646 - 651
  • [5] GLACIER SURFACE MONITORING BY MAXIMIZING MUTUAL INFORMATION
    Erten, Esra
    Rossi, Cristian
    Hajnsek, Irena
    XXII ISPRS CONGRESS, TECHNICAL COMMISSION VII, 2012, 39 (B7): : 41 - 44
  • [6] Feature Selection by Maximizing Part Mutual Information
    Gao, Wanfu
    Hu, Liang
    Zhang, Ping
    2018 INTERNATIONAL CONFERENCE ON SIGNAL PROCESSING AND MACHINE LEARNING (SPML 2018), 2018, : 120 - 127
  • [7] Clustering by Maximizing Mutual Information Across Views
    Kien Do
    Truyen Tran
    Venkatesh, Svetha
    2021 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2021), 2021, : 9908 - 9918
  • [8] Estimating and Maximizing Mutual Information for Knowledge Distillation
    Shrivastava, Aman
    Qi, Yanjun
    Ordonez, Vicente
    2023 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION WORKSHOPS, CVPRW, 2023, : 48 - 57
  • [9] Learning Representations by Maximizing Mutual Information in Variational Autoencoders
    Rezaabad, Ali Lotfi
    Vishwanath, Sriram
    2020 IEEE INTERNATIONAL SYMPOSIUM ON INFORMATION THEORY (ISIT), 2020, : 2729 - 2734
  • [10] Learning Representations by Maximizing Mutual Information Across Views
    Bachman, Philip
    Hjelm, R. Devon
    Buchwalter, William
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 32 (NIPS 2019), 2019, 32