Conditional Density Estimation with Dimensionality Reduction via Squared-Loss Conditional Entropy Minimization

Cited by: 10
Authors
Tangkaratt, Voot [1 ]
Xie, Ning [1 ]
Sugiyama, Masashi [1 ]
Affiliations
[1] Tokyo Inst Technol, Dept Comp Sci, Meguro Ku, Tokyo 1528552, Japan
Keywords
INVERSE REGRESSION; DIVERGENCE
DOI
10.1162/NECO_a_00683
CLC number
TP18 [Artificial Intelligence Theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Regression aims at estimating the conditional mean of output given input. However, regression is not informative enough if the conditional density is multimodal, heteroskedastic, and asymmetric. In such a case, estimating the conditional density itself is preferable, but conditional density estimation (CDE) is challenging in high-dimensional space. A naive approach to coping with high dimensionality is to first perform dimensionality reduction (DR) and then execute CDE. However, a two-step process does not perform well in practice because the error incurred in the first DR step can be magnified in the second CDE step. In this letter, we propose a novel single-shot procedure that performs CDE and DR simultaneously in an integrated way. Our key idea is to formulate DR as the problem of minimizing a squared-loss variant of conditional entropy, and this is solved using CDE. Thus, an additional CDE step is not needed after DR. We demonstrate the usefulness of the proposed method through extensive experiments on various data sets, including humanoid robot transition and computer art.
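The CDE building block the abstract refers to can be illustrated with least-squares conditional density estimation (LSCDE), which fits a nonnegative linear combination of Gaussian kernels to the conditional density. The sketch below is a minimal NumPy illustration of plain CDE only, without the paper's simultaneous dimensionality-reduction step; the toy data, kernel centers, bandwidth `sigma`, and regularizer `lam` are assumed settings for demonstration, not the paper's choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression data: y = sin(x) + heteroskedastic noise.
n = 200
x = rng.uniform(-3, 3, size=n)
y = np.sin(x) + 0.1 * (1 + np.abs(x)) * rng.standard_normal(n)

# Gaussian kernel centers: a random subset of the training pairs (assumed choice).
b = 50
idx = rng.choice(n, size=b, replace=False)
u, v = x[idx], y[idx]       # input-side / output-side centers
sigma = 0.5                 # kernel bandwidth (assumed choice)
lam = 0.1                   # ridge regularizer (assumed choice)

def kx(xq):
    """Input-side Gaussian kernel values, shape (len(xq), b)."""
    return np.exp(-(xq[:, None] - u[None, :]) ** 2 / (2 * sigma ** 2))

# h_j = (1/n) sum_i phi_j(x_i, y_i), with phi_j a product of Gaussians in x and y.
Phi = kx(x) * np.exp(-(y[:, None] - v[None, :]) ** 2 / (2 * sigma ** 2))
h = Phi.mean(axis=0)

# G_{jj'} = (1/n) sum_i \int phi_j(x_i, y) phi_{j'}(x_i, y) dy;
# the y-integral of two Gaussians has the closed form used below.
Kx = kx(x)
G = (np.sqrt(np.pi) * sigma
     * np.exp(-(v[:, None] - v[None, :]) ** 2 / (4 * sigma ** 2))
     * (Kx[:, :, None] * Kx[:, None, :]).mean(axis=0))

# Regularized least squares, then clip to keep the density estimate nonnegative.
alpha = np.linalg.solve(G + lam * np.eye(b), h)
alpha = np.maximum(alpha, 0.0)

def cond_density(xq, yq):
    """Normalized estimate p_hat(y | x = xq) evaluated on a grid yq."""
    num = (kx(np.full_like(yq, xq))
           * np.exp(-(yq[:, None] - v[None, :]) ** 2 / (2 * sigma ** 2))) @ alpha
    # Each output-side Gaussian integrates to sqrt(2*pi)*sigma, so this
    # normalizer makes the estimate integrate to one over y.
    z = np.sqrt(2 * np.pi) * sigma * (kx(np.array([xq])) @ alpha)[0]
    return num / z
```

Because the conditional noise grows with |x|, the estimated density at x = 2 comes out wider than at x = 0, which is exactly the kind of structure a conditional-mean regressor cannot express.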
Pages: 228-254
Page count: 27
Related papers (50 total)
  • [1] A Conditional Entropy Minimization Criterion for Dimensionality Reduction and Multiple Kernel Learning
    Hino, Hideitsu
    Murata, Noboru
    NEURAL COMPUTATION, 2010, 22 (11) : 2887 - 2923
  • [2] Sufficient Dimension Reduction via Squared-Loss Mutual Information Estimation
    Suzuki, Taiji
    Sugiyama, Masashi
    NEURAL COMPUTATION, 2013, 25 (03) : 725 - 758
  • [3] Minimax Rates for Conditional Density Estimation via Empirical Entropy
    Bilodeau, Blair
    Foster, Dylan J.
    Roy, Daniel M.
    ANNALS OF STATISTICS, 2023, 51 (02) : 762 - 790
  • [4] Attribute reduction via local conditional entropy
    Wang, Yibo
    Chen, Xiangjian
    Dong, Kai
    INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 2019, 10 (12) : 3619 - 3634
  • [5] Dimension Reduction and Adaptation in Conditional Density Estimation
    Efromovich, Sam
    JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION, 2010, 105 (490) : 761 - 774
  • [6] Squared-Loss Mutual Information via High-Dimension Coherence Matrix Estimation
    de Cabrera, Ferran
    Riba, Jaume
    2019 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2019 : 5142 - 5146
  • [7] On conditional density estimation
    De Gooijer, JG
    Zerom, D
    STATISTICA NEERLANDICA, 2003, 57 (02) : 159 - 176
  • [8] Conditional Graph Entropy as an Alternating Minimization Problem
    Harangi, Viktor
    Niu, Xueyan
    Bai, Bo
    IEEE TRANSACTIONS ON INFORMATION THEORY, 2024, 70 (02) : 904 - 919
  • [9] Sliced Inverse Regression with Conditional Entropy Minimization
    Hino, Hideitsu
    Wakayama, Keigo
    Murata, Noboru
    2012 21ST INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR 2012), 2012 : 1185 - 1188