Computationally Efficient Estimation of Squared-Loss Mutual Information with Multiplicative Kernel Models

Cited by: 9
Authors
Sakai, Tomoya [1]
Sugiyama, Masashi [1]
Affiliations
[1] Tokyo Institute of Technology, Department of Computer Science, Tokyo 152-8552, Japan
Source
IEICE TRANSACTIONS ON INFORMATION AND SYSTEMS, 2014, E97D: 968 - 971
Keywords
squared-loss mutual information; least-squares mutual information; density ratio estimation; multiplicative kernel models; independence test
DOI
10.1587/transinf.E97.D.968
Chinese Library Classification
TP [Automation Technology, Computer Technology]
Subject Classification Code
0812
Abstract
Squared-loss mutual information (SMI) is a robust measure of the statistical dependence between random variables. The sample-based SMI approximator called least-squares mutual information (LSMI) has been demonstrated to be useful in various machine learning tasks such as dimension reduction, clustering, and causal inference. The original LSMI approximates pointwise mutual information with a kernel model, a linear combination of kernel basis functions located on paired data samples. Although LSMI was proved to achieve the optimal approximation accuracy asymptotically, its approximation capability is limited when the sample size is small, because the number of kernel basis functions is then insufficient. Increasing the number of kernel basis functions can mitigate this weakness, but a naive implementation of this idea significantly increases the computation cost. In this article, we show that LSMI with the multiplicative kernel model, which locates kernel basis functions on unpaired data samples so that the number of basis functions grows as the square of the sample size, has the same computational complexity as LSMI with the plain kernel model. We experimentally demonstrate that LSMI with the multiplicative kernel model is more accurate than LSMI with the plain kernel model in small-sample cases, with only a mild increase in computation time.
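For a concrete picture of the estimator described above, the following is a minimal NumPy sketch of LSMI with the plain kernel model (one Gaussian basis per paired sample). The kernel widths sigma_x and sigma_y, the regularization parameter lam, and the function names are illustrative assumptions, not the authors' implementation; the analytic solution theta = (H + lam*I)^{-1} h and the estimate SMI ≈ h'theta/2 - 1/2 follow the standard LSMI formulation.

```python
# A minimal sketch of LSMI with the plain Gaussian kernel model.
# sigma_x, sigma_y, lam, and the helper names are illustrative assumptions.
import numpy as np

def gram(a, b, sigma):
    """Gaussian Gram matrix K[i, l] = exp(-||a_i - b_l||^2 / (2 sigma^2))."""
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lsmi(x, y, sigma_x=1.0, sigma_y=1.0, lam=1e-3):
    """Estimate SMI(X, Y) from paired samples x of shape (n, dx), y of (n, dy).

    Model: r(x, y) = sum_l theta_l K(x, x_l) L(y, y_l), one basis function
    per paired sample (the "plain" kernel model of the abstract).
    """
    n = x.shape[0]
    Phi = gram(x, x, sigma_x)          # Phi[i, l] = K(x_i, x_l)
    Psi = gram(y, y, sigma_y)          # Psi[j, l] = L(y_j, y_l)
    # H_{l l'} = E_{p(x)p(y)}[phi_l phi_l'] factorizes into an elementwise
    # product of the two Gram-matrix cross products over all sample pairs.
    H = (Phi.T @ Phi) * (Psi.T @ Psi) / n ** 2
    # h_l = E_{p(x,y)}[phi_l], approximated on the paired samples.
    h = (Phi * Psi).mean(axis=0)
    theta = np.linalg.solve(H + lam * np.eye(n), h)
    return 0.5 * h @ theta - 0.5

# Dependent toy data: y is a noisy copy of x, so the estimate should be
# clearly positive; for independent x and y it should be near zero.
rng = np.random.default_rng(0)
x = rng.normal(size=(200, 1))
y = x + 0.1 * rng.normal(size=(200, 1))
print(lsmi(x, y))
```

Note that H above already factorizes into an elementwise product of two n-by-n Gram terms. A plausible reading of the paper's efficiency claim is that, under the multiplicative kernel model, the corresponding n^2-by-n^2 matrix inherits a Kronecker-product structure from those same two factors, so the larger system can be solved with the same order of computation as the plain model.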
Pages: 968 - 971
Number of pages: 4
Related Papers
50 records in total
  • [1] Machine Learning with Squared-Loss Mutual Information
    Sugiyama, Masashi
    ENTROPY, 2013, 15 (01) : 80 - 112
  • [2] Sufficient Dimension Reduction via Squared-Loss Mutual Information Estimation
    Suzuki, Taiji
    Sugiyama, Masashi
    NEURAL COMPUTATION, 2013, 25 (03) : 725 - 758
  • [3] Information-Maximization Clustering Based on Squared-Loss Mutual Information
    Sugiyama, Masashi
    Niu, Gang
    Yamada, Makoto
    Kimura, Manabu
    Hachiya, Hirotaka
    NEURAL COMPUTATION, 2014, 26 (01) : 84 - 131
  • [4] Canonical dependency analysis based on squared-loss mutual information
    Karasuyama, Masayuki
    Sugiyama, Masashi
    NEURAL NETWORKS, 2012, 34 : 46 - 55
  • [5] Estimating Squared-Loss Mutual Information for Independent Component Analysis
    Suzuki, Taiji
    Sugiyama, Masashi
    INDEPENDENT COMPONENT ANALYSIS AND SIGNAL SEPARATION, PROCEEDINGS, 2009, 5441 : 130+
  • [6] Cross-Domain Matching with Squared-Loss Mutual Information
    Yamada, Makoto
    Sigal, Leonid
    Raptis, Michalis
    Toyoda, Machiko
    Chang, Yi
    Sugiyama, Masashi
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2015, 37 (09) : 1764 - 1776
  • [7] Squared-Loss Mutual Information via High-Dimension Coherence Matrix Estimation
    de Cabrera, Ferran
    Riba, Jaume
    2019 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2019, : 5142 - 5146
  • [8] Registration of infrared transmission images using squared-loss mutual information
    Sakai, Tomoya
    Sugiyama, Masashi
    Kitagawa, Katsuichi
    Suzuki, Kazuyoshi
    PRECISION ENGINEERING-JOURNAL OF THE INTERNATIONAL SOCIETIES FOR PRECISION ENGINEERING AND NANOTECHNOLOGY, 2015, 39 : 187 - 193
  • [9] Feature Selection via ℓ1-Penalized Squared-Loss Mutual Information
    Jitkrittum, Wittawat
    Hachiya, Hirotaka
    Sugiyama, Masashi
    IEICE TRANSACTIONS ON INFORMATION AND SYSTEMS, 2013, E96D (07) : 1513 - 1524
  • [10] A computationally efficient estimator for mutual information
    Evans, Dafydd
    PROCEEDINGS OF THE ROYAL SOCIETY A-MATHEMATICAL PHYSICAL AND ENGINEERING SCIENCES, 2008, 464 (2093): : 1203 - 1215