Water from Two Rocks: Maximizing the Mutual Information

Cited by: 13
Authors
Kong, Yuqing [1]
Schoenebeck, Grant [1]
Affiliations
[1] Univ Michigan, Ann Arbor, MI 48109 USA
Funding
US National Science Foundation;
Keywords
Peer prediction; co-training; information theory; scoring rules; likelihood; prediction; divergence;
DOI
10.1145/3219166.3219194
CLC Number
TP301 [Theory, Methods];
Subject Classification Code
081202;
Abstract
We build a natural connection between the learning problem of co-training and forecast elicitation without verification (related to peer prediction), and address both simultaneously using the same information-theoretic approach. In co-training/multiview learning [5], the goal is to aggregate two views of data into a prediction for a latent label. We show how to optimally combine the two views by reducing the problem to an optimization problem, giving a unified and rigorous approach to the general setting. In forecast elicitation without verification, we seek to design a mechanism that elicits high-quality forecasts from agents when the mechanism has no access to the ground truth. Assuming the agents' information is independent conditioned on the outcome, we propose mechanisms in which truth-telling is a strict equilibrium, for both the single-task and multi-task settings. Our multi-task mechanism additionally has the property that the truth-telling equilibrium pays at least as much as any other strategy profile, and strictly more than any "non-permutation" strategy profile.
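As a rough sketch of the mutual-information objective behind the co-training result (a hypothetical illustration, not code from the paper), the following Python snippet estimates the empirical mutual information between two discrete views from paired samples. The function name, the toy noise rates, and the use of plain Shannon mutual information (the paper works with the more general f-mutual information) are all assumptions made here for illustration.

import numpy as np

def empirical_mutual_info(x, y):
    """Estimate I(X; Y) in bits from paired samples of two discrete views."""
    x, y = np.asarray(x), np.asarray(y)
    # Joint distribution from the contingency table of co-occurrence counts.
    _, xi = np.unique(x, return_inverse=True)
    _, yi = np.unique(y, return_inverse=True)
    joint = np.zeros((xi.max() + 1, yi.max() + 1))
    np.add.at(joint, (xi, yi), 1)
    joint /= joint.sum()
    # Marginals; sum the MI terms only over cells with nonzero joint mass.
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    nz = joint > 0
    return float(np.sum(joint[nz] * np.log2(joint[nz] / (px @ py)[nz])))

# Toy check: two noisy copies of the same latent bit share information,
# so their estimated mutual information is well above zero.
rng = np.random.default_rng(0)
label = rng.integers(0, 2, size=10_000)        # latent ground truth
view_a = label ^ (rng.random(10_000) < 0.1)    # view 1: 10% flip noise
view_b = label ^ (rng.random(10_000) < 0.2)    # view 2: 20% flip noise
print(f"estimated I(A; B): {empirical_mutual_info(view_a, view_b):.3f} bits")

With these assumed noise rates the two views agree with probability 0.9 * 0.8 + 0.1 * 0.2 = 0.74, so the analytic value is 1 - H2(0.74), roughly 0.17 bits, which the empirical estimate should approach as the sample grows.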
Pages: 177 - 194
Page count: 18
Related Papers (50 total)
  • [1] On Binary Quantizer For Maximizing Mutual Information
    Nguyen, Thuan Duc
    Nguyen, Thinh
    IEEE TRANSACTIONS ON COMMUNICATIONS, 2020, 68 (09) : 5435 - 5445
  • [2] GLACIER SURFACE MONITORING BY MAXIMIZING MUTUAL INFORMATION
    Erten, Esra
    Rossi, Cristian
    Hajnsek, Irena
    XXII ISPRS CONGRESS, TECHNICAL COMMISSION VII, 2012, 39 (B7) : 41 - 44
  • [3] Feature Selection by Maximizing Part Mutual Information
    Gao, Wanfu
    Hu, Liang
    Zhang, Ping
    2018 INTERNATIONAL CONFERENCE ON SIGNAL PROCESSING AND MACHINE LEARNING (SPML 2018), 2018 : 120 - 127
  • [4] Clustering by Maximizing Mutual Information Across Views
    Do, Kien
    Tran, Truyen
    Venkatesh, Svetha
    2021 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2021), 2021 : 9908 - 9918
  • [5] Estimating and Maximizing Mutual Information for Knowledge Distillation
    Shrivastava, Aman
    Qi, Yanjun
    Ordonez, Vicente
    2023 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION WORKSHOPS, CVPRW, 2023 : 48 - 57
  • [6] Learning Representations by Maximizing Mutual Information in Variational Autoencoders
    Rezaabad, Ali Lotfi
    Vishwanath, Sriram
    2020 IEEE INTERNATIONAL SYMPOSIUM ON INFORMATION THEORY (ISIT), 2020 : 2729 - 2734
  • [7] Learning Representations by Maximizing Mutual Information Across Views
    Bachman, Philip
    Hjelm, R. Devon
    Buchwalter, William
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 32 (NIPS 2019), 2019, 32
  • [8] Learning LBP structure by maximizing the conditional mutual information
    Ren, Jianfeng
    Jiang, Xudong
    Yuan, Junsong
    PATTERN RECOGNITION, 2015, 48 (10) : 3180 - 3190
  • [9] Automatic Threshold Selection Guided by Maximizing Normalized Mutual Information
    Zou, Y.-B.
    Lei, B.-J.
    Zang, Z.-X.
    Wang, J.-Y.
    Hu, Z.-H.
    Dong, F.-M.
    Zidonghua Xuebao/Acta Automatica Sinica, 2019, 45 (07) : 1373 - 1385
  • [10] Automatic Extrinsic Calibration of Vision and Lidar by Maximizing Mutual Information
    Pandey, Gaurav
    McBride, James R.
    Savarese, Silvio
    Eustice, Ryan M.
    JOURNAL OF FIELD ROBOTICS, 2015, 32 (05) : 696 - 722