Computationally Efficient Estimation of Squared-Loss Mutual Information with Multiplicative Kernel Models

Cited by: 9
Authors
Sakai, Tomoya [1 ]
Sugiyama, Masashi [1 ]
Affiliations
[1] Tokyo Inst Technol, Dept Comp Sci, Tokyo 1528552, Japan
Keywords
squared-loss mutual information; least-squares mutual information; density ratio estimation; multiplicative kernel models; independence test;
DOI
10.1587/transinf.E97.D.968
CLC number
TP [Automation Technology, Computer Technology]
Subject classification code
0812
Abstract
Squared-loss mutual information (SMI) is a robust measure of the statistical dependence between random variables. The sample-based SMI approximator called least-squares mutual information (LSMI) has been demonstrated to be useful in various machine learning tasks such as dimension reduction, clustering, and causal inference. The original LSMI approximates the pointwise mutual information using a kernel model, i.e., a linear combination of kernel basis functions located on paired data samples. Although LSMI was proved to achieve the optimal approximation accuracy asymptotically, its approximation capability is limited when the sample size is small, due to an insufficient number of kernel basis functions. Increasing the number of kernel basis functions can mitigate this weakness, but a naive implementation of this idea significantly increases the computation cost. In this article, we show that LSMI with the multiplicative kernel model, which locates kernel basis functions on unpaired data samples so that the number of basis functions is the sample size squared, has the same computational complexity as LSMI with the plain kernel model. We experimentally demonstrate that LSMI with the multiplicative kernel model is more accurate than LSMI with the plain kernel model in small-sample cases, with only a mild increase in computation time.
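The idea summarized in the abstract can be sketched in NumPy. The following is a minimal, hypothetical illustration (not the authors' code): it uses Gaussian kernels with a fixed width `sigma` and a fixed regularizer `lam`, whereas the actual method selects these by cross-validation. The key point it demonstrates is that, for the multiplicative kernel model, the n² × n² regularized linear system factorizes as a Kronecker product, so it can be solved with two n × n eigendecompositions at the same O(n³) cost as the plain model.

```python
import numpy as np

def _gauss_gram(a, sigma):
    """Gaussian kernel Gram matrix G[i, l] = exp(-||a_i - a_l||^2 / (2 sigma^2))."""
    a = a.reshape(len(a), -1)
    d2 = ((a[:, None, :] - a[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lsmi_plain(x, y, sigma=1.0, lam=1e-3):
    """LSMI with the plain kernel model: n basis functions on paired samples."""
    n = len(x)
    K = _gauss_gram(x, sigma)
    L = _gauss_gram(y, sigma)
    # H[l, l'] = (1/n^2) sum_{i,j} k(x_i,x_l) k(x_i,x_l') k(y_j,y_l) k(y_j,y_l')
    H = (K.T @ K) * (L.T @ L) / n ** 2
    # h[l] = (1/n) sum_i k(x_i, x_l) k(y_i, y_l)
    h = (K * L).mean(axis=0)
    theta = np.linalg.solve(H + lam * np.eye(n), h)
    return 0.5 * h @ theta - 0.5        # plug-in SMI estimate

def lsmi_multiplicative(x, y, sigma=1.0, lam=1e-3):
    """LSMI with the multiplicative kernel model: n^2 basis functions
    k(x, x_u) * k(y, y_v) on all (unpaired) sample combinations.
    Here H = (1/n^2) (K'K) kron (L'L), so (H + lam I) theta = h is solved
    with two n x n eigendecompositions instead of one n^2 x n^2 solve."""
    n = len(x)
    K = _gauss_gram(x, sigma)
    L = _gauss_gram(y, sigma)
    evals_x, U = np.linalg.eigh(K.T @ K)
    evals_y, V = np.linalg.eigh(L.T @ L)
    # h[u, v] = (1/n) sum_i k(x_i, x_u) k(y_i, y_v)
    h = (K.T @ L) / n
    denom = np.outer(evals_x, evals_y) / n ** 2 + lam
    theta = U @ ((U.T @ h @ V) / denom) @ V.T
    return 0.5 * np.sum(h * theta) - 0.5
```

On strongly dependent data both estimators return a clearly positive value, while on independent data the estimates stay near zero; the multiplicative variant uses n² basis functions but runs in the same O(n³) time.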
Pages: 968 - 971
Page count: 4
Related Papers
50 items in total
  • [11] Conditional Density Estimation with Dimensionality Reduction via Squared-Loss Conditional Entropy Minimization
    Tangkaratt, Voot
    Xie, Ning
    Sugiyama, Masashi
    NEURAL COMPUTATION, 2015, 27 (01) : 228 - 254
  • [12] Computationally Efficient Mutual Information Estimation for Non-rigid Image Registration
    Gholipour, Ali
    Kehtarnavaz, Nasser
    2008 15TH IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING, VOLS 1-5, 2008, : 1792 - 1795
  • [13] Minimum Mean Squared Error Estimation and Mutual Information Gain
    Gibson, Jerry
    INFORMATION, 2024, 15 (08)
  • [14] A computationally efficient optimization kernel for material parameter estimation procedures
    Schmid, H.
    Nash, M. P.
    Young, A. A.
    Rohrle, O.
    Hunter, P. J.
    JOURNAL OF BIOMECHANICAL ENGINEERING-TRANSACTIONS OF THE ASME, 2007, 129 (02) : 279 - 283
  • [15] Estimation of mutual information via quantum kernel methods
    Maeda, Yota
    Kawaguchi, Hideaki
    Tezuka, Hiroyuki
    QUANTUM MACHINE INTELLIGENCE, 2025, 7 (01)
  • [16] Estimation of Mutual Information Using Kernel Density Estimators
    Moon, Y. I.
    Rajagopalan, B.
    Lall, U.
    PHYSICAL REVIEW E, 1995, 52 (03) : 2318 - 2321
  • [17] Kernel density estimation for multiplicative distortion measurement regression models
    Zhang, Jun
    Chen, Aixian
    Wei, Zhenghong
    COMMUNICATIONS IN STATISTICS-SIMULATION AND COMPUTATION, 2023, 52 (05) : 1733 - 1752
  • [19] Maximum nonparametric kernel likelihood estimation for multiplicative linear regression models
    Zhang, Jun
    Lin, Bingqing
    Yang, Yiping
    STATISTICAL PAPERS, 2022, 63 (03) : 885 - 918
  • [20] Efficient Estimation of Mutual Information for Strongly Dependent Variables
    Gao, Shuyang
    Ver Steeg, Greg
    Galstyan, Aram
    ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 38, 2015, 38 : 277 - 286