Information-Maximization Clustering Based on Squared-Loss Mutual Information

Cited by: 26
Authors
Sugiyama, Masashi [1]
Niu, Gang [1]
Yamada, Makoto [2]
Kimura, Manabu [1]
Hachiya, Hirotaka [1]
Affiliations
[1] Tokyo Inst Technol, Meguro-ku, Tokyo 152-8552, Japan
[2] Yahoo Labs, Sunnyvale, CA 94089 USA
Keywords
VARIATIONAL INFERENCE; MEAN-SHIFT; K-MEANS; MIXTURES; HARDNESS
DOI
10.1162/NECO_a_00534
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Information-maximization clustering learns a probabilistic classifier in an unsupervised manner so that mutual information between feature vectors and cluster assignments is maximized. A notable advantage of this approach is that it involves only continuous optimization of model parameters, which is substantially simpler than discrete optimization of cluster assignments. However, existing methods still involve nonconvex optimization problems, and therefore finding a good locally optimal solution is not straightforward in practice. In this letter, we propose an alternative information-maximization clustering method based on a squared-loss variant of mutual information. This novel approach gives a clustering solution analytically, in a computationally efficient way, via kernel eigenvalue decomposition. Furthermore, we provide a practical model selection procedure that allows us to objectively optimize tuning parameters included in the kernel function. Through experiments, we demonstrate the usefulness of the proposed approach.
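The abstract's central claim, that the clustering solution follows analytically from a kernel eigenvalue decomposition rather than from iterative discrete optimization, can be illustrated with a short sketch. The NumPy snippet below is a rough, assumption-laden toy, not the paper's SMIC algorithm: the paper uses a sparse local-scaling kernel, fits a class-posterior model from the eigenvectors, and selects kernel tuning parameters by an SMI-based criterion, all of which are omitted here. The function name smic_sketch, the plain Gaussian kernel, and the sign-flip normalization are illustrative choices.

```python
import numpy as np

def smic_sketch(X, n_clusters, sigma=1.0):
    """Toy illustration of information-maximization clustering via
    kernel eigendecomposition (loosely inspired by SMIC); the real
    method's kernel choice and post-processing differ."""
    # Plain Gaussian kernel matrix (an assumption; the paper uses a
    # sparse local-scaling kernel).
    sq = np.sum(X ** 2, axis=1)
    K = np.exp(-(sq[:, None] + sq[None, :] - 2.0 * X @ X.T)
               / (2.0 * sigma ** 2))
    # The key point from the abstract: the solution comes analytically
    # from the top eigenvectors of the kernel matrix, with no iterative
    # discrete optimization over cluster assignments.
    vals, vecs = np.linalg.eigh(K)  # eigenvalues in ascending order
    top = vecs[:, np.argsort(vals)[::-1][:n_clusters]]
    # Flip signs so each eigenvector's mean is nonnegative, then assign
    # each point to the cluster whose component is largest.
    top = top * np.sign(top.sum(axis=0) + 1e-12)
    return np.argmax(top, axis=1)

if __name__ == "__main__":
    # Two well-separated Gaussian blobs; the eigenvector-based
    # assignment recovers the two groups.
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(-3, 0.5, (50, 2)),
                   rng.normal(3, 0.5, (50, 2))])
    print(smic_sketch(X, n_clusters=2))
```

In the paper itself, the kernel width would not be hand-set as above; the proposed SMI-based model selection procedure chooses it objectively.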
Pages: 84-131
Page count: 48
Related Papers
50 records in total
  • [1] Machine Learning with Squared-Loss Mutual Information
    Sugiyama, Masashi
    ENTROPY, 2013, 15 (01) : 80 - 112
  • [2] Canonical dependency analysis based on squared-loss mutual information
    Karasuyama, Masayuki
    Sugiyama, Masashi
    NEURAL NETWORKS, 2012, 34 : 46 - 55
  • [3] Estimating Squared-Loss Mutual Information for Independent Component Analysis
    Suzuki, Taiji
    Sugiyama, Masashi
    INDEPENDENT COMPONENT ANALYSIS AND SIGNAL SEPARATION, PROCEEDINGS, 2009, 5441: 130+
  • [4] Cross-Domain Matching with Squared-Loss Mutual Information
    Yamada, Makoto
    Sigal, Leonid
    Raptis, Michalis
    Toyoda, Machiko
    Chang, Yi
    Sugiyama, Masashi
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2015, 37 (09) : 1764 - 1776
  • [5] Sufficient Dimension Reduction via Squared-Loss Mutual Information Estimation
    Suzuki, Taiji
    Sugiyama, Masashi
    NEURAL COMPUTATION, 2013, 25 (03) : 725 - 758
  • [6] Registration of infrared transmission images using squared-loss mutual information
    Sakai, Tomoya
    Sugiyama, Masashi
    Kitagawa, Katsuichi
    Suzuki, Kazuyoshi
    PRECISION ENGINEERING-JOURNAL OF THE INTERNATIONAL SOCIETIES FOR PRECISION ENGINEERING AND NANOTECHNOLOGY, 2015, 39 : 187 - 193
  • [7] Semi-supervised information-maximization clustering
    Calandriello, Daniele
    Niu, Gang
    Sugiyama, Masashi
    NEURAL NETWORKS, 2014, 57 : 103 - 111
  • [8] Computationally Efficient Estimation of Squared-Loss Mutual Information with Multiplicative Kernel Models
    Sakai, Tomoya
    Sugiyama, Masashi
    IEICE TRANSACTIONS ON INFORMATION AND SYSTEMS, 2014, E97D (04): : 968 - 971
  • [9] Feature Selection via ℓ1-Penalized Squared-Loss Mutual Information
    Jitkrittum, Wittawat
    Hachiya, Hirotaka
    Sugiyama, Masashi
    IEICE TRANSACTIONS ON INFORMATION AND SYSTEMS, 2013, E96D (07) : 1513 - 1524
  • [10] Squared-Loss Mutual Information via High-Dimension Coherence Matrix Estimation
    de Cabrera, Ferran
    Riba, Jaume
    2019 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2019, : 5142 - 5146