Accuracy of latent-variable estimation in Bayesian semi-supervised learning

Cited by: 3
Authors
Yamazaki, Keisuke [1 ]
Affiliation
[1] Tokyo Institute of Technology, Department of Computational Intelligence and Systems Science, Midori-ku, Yokohama, Kanagawa 227, Japan
Keywords
Latent-variable estimation; Generative and discriminative models; Bayes statistics; Maximum likelihood
DOI
10.1016/j.neunet.2015.04.012
CLC number
TP18 [Artificial intelligence theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Hierarchical probabilistic models, such as Gaussian mixture models, are widely used for unsupervised learning tasks. These models consist of observable and latent variables, which represent the observable data and the underlying data-generation process, respectively. Unsupervised learning tasks, such as cluster analysis, can be regarded as the estimation of the latent variables from the observable ones. In semi-supervised learning, where some labels are observed, the estimation of latent variables will be more precise than in the unsupervised case, and a key concern is to clarify the effect of the labeled data. However, the accuracy of latent-variable estimation has not been analyzed sufficiently from a theoretical standpoint. In a previous study, a distribution-based error function was formulated, and its asymptotic form was derived for unsupervised learning with generative models; it was shown that, for the estimation of latent variables, the Bayes method is more accurate than the maximum-likelihood method. The present paper derives the asymptotic forms of this error function in Bayesian semi-supervised learning for both discriminative and generative models. The results show that the generative model, which uses all of the given data, performs better when the model is well specified. (C) 2015 Elsevier Ltd. All rights reserved.
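To make the abstract's central quantity concrete, the following is a minimal sketch, in LaTeX and with assumed notation, of a distribution-based error function for latent-variable estimation in the semi-supervised setting: the expected Kullback-Leibler divergence between the true conditional distribution of the unobserved labels and its Bayesian estimate. The symbols below (q for the true distribution, p for the model, X^n for the n unlabeled observations, and (X'^m, Y'^m) for the m labeled pairs) are illustrative assumptions based on the abstract, not the paper's exact definitions.

% Assumed notation: q = true distribution, p = model; Y^n are the unobserved
% labels of the n unlabeled points X^n; (X'^m, Y'^m) are the m labeled pairs.
\[
  D(n, m) \;=\; \frac{1}{n}\,
  \mathbb{E}\!\left[
    \sum_{Y^n} q\!\left(Y^n \mid X^n\right)
    \log \frac{q\!\left(Y^n \mid X^n\right)}{p\!\left(Y^n \mid X^n, X'^m, Y'^m\right)}
  \right],
\]
where the Bayesian estimate marginalizes the model parameter w over its posterior,
\[
  p\!\left(Y^n \mid X^n, X'^m, Y'^m\right)
  \;=\; \int p\!\left(Y^n \mid X^n, w\right)\, p\!\left(w \mid X^n, X'^m, Y'^m\right) dw .
\]
Under this reading, replacing the posterior average over w with a plug-in of the maximum-likelihood estimate yields the corresponding error for the maximum-likelihood method, which the previous study found to be less accurate for latent-variable estimation.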
Pages: 1 - 10
Page count: 10
Related papers
50 records in total
  • [21] Semi-supervised learning with density-ratio estimation
    Kawakita, Masanori
    Kanamori, Takafumi
    MACHINE LEARNING, 2013, 91 (02) : 189 - 209
  • [22] Safe semi-supervised learning using a bayesian neural network
    Bae, Jinsoo
    Lee, Minjung
    Kim, Seoung Bum
    INFORMATION SCIENCES, 2022, 612 : 453 - 464
  • [23] Semi-supervised learning for k-dependence Bayesian classifiers
    Wang, LiMin
    Zhang, XinHao
    Li, Kuo
    Zhang, Shuai
    APPLIED INTELLIGENCE, 2022, 52 : 3604 - 3622
  • [25] Toward a Semi-Supervised Learning Approach to Phylogenetic Estimation
    Silvestro, Daniele
    Latrille, Thibault
    Salamin, Nicolas
    SYSTEMATIC BIOLOGY, 2024, 73 (05) : 789 - 806
  • [26] Siamese Graph Learning for Semi-Supervised Age Estimation
    Liu, Hao
    Ma, Mei
    Gao, Zixian
    Deng, Zongyong
    Li, Fengjun
    Li, Zhendong
    IEEE TRANSACTIONS ON MULTIMEDIA, 2023, 25 : 9586 - 9596
  • [27] An accuracy-maximization learning framework for supervised and semi-supervised imbalanced data
    Wang, Guanjin
    Wong, Kok Wai
    KNOWLEDGE-BASED SYSTEMS, 2022, 255
  • [28] Distribution-free Bayesian regularized learning framework for semi-supervised learning
    Ma, Jun
    Yu, Guolin
    NEURAL NETWORKS, 2024, 174
  • [29] Regularized Semi-Supervised Latent Dirichlet Allocation for visual concept learning
    Zhuang, Liansheng
    Gao, Haoyuan
    Luo, Jiebo
    Lin, Zhouchen
    NEUROCOMPUTING, 2013, 119 : 26 - 32
  • [30] Maximum Reconstruction Estimation for Generative Latent-Variable Models
    Cheng, Yong
    Liu, Yang
    Xu, Wei
    THIRTY-FIRST AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2017 : 3173 - 3179