Accuracy of latent-variable estimation in Bayesian semi-supervised learning

Cited by: 3
Author
Yamazaki, Keisuke [1 ]
Affiliation
[1] Tokyo Inst Technol, Dept Computat Intelligence & Syst Sci, Midori Ku, Yokohama, Kanagawa 227, Japan
Keywords
Latent-variable estimation; Generative and discriminative models; Bayes statistics; Maximum likelihood
DOI
10.1016/j.neunet.2015.04.012
CLC number
TP18 [Artificial Intelligence Theory]
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Hierarchical probabilistic models, such as Gaussian mixture models, are widely used for unsupervised learning tasks. These models consist of observable and latent variables, which represent the observable data and the underlying data-generation process, respectively. Unsupervised learning tasks, such as cluster analysis, are regarded as estimations of latent variables based on the observable ones. In semi-supervised learning, where some labels are observed, latent-variable estimation is expected to be more precise than in the unsupervised case, and a central concern is to quantify the effect of the labeled data. However, the accuracy of latent-variable estimation has not been analyzed sufficiently from a theoretical standpoint. In a previous study, a distribution-based error function was formulated, and its asymptotic form was calculated for unsupervised learning with generative models. It has been shown that, for the estimation of latent variables, the Bayes method is more accurate than the maximum-likelihood method. The present paper derives the asymptotic forms of the error function in Bayesian semi-supervised learning for both discriminative and generative models. The results show that the generative model, which uses all of the given data, performs better when the model is well specified. (C) 2015 Elsevier Ltd. All rights reserved.
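To make the setting concrete, the following is a minimal sketch (not the paper's method) of semi-supervised latent-variable estimation in a one-dimensional Gaussian mixture: an EM-style loop in which the responsibilities of labeled points are clamped to their observed component, while unlabeled points receive posterior responsibilities. The function name `semi_supervised_gmm_em` and all parameter choices are illustrative assumptions.

```python
import numpy as np

def semi_supervised_gmm_em(x, labels, n_components=2, n_iter=50, seed=0):
    """EM for a 1-D Gaussian mixture where some points carry known labels.

    labels[i] is the component index for a labeled point, or -1 if unlabeled.
    Labeled points have their responsibilities clamped to the observed label,
    which is how the labeled data constrains the latent-variable estimates.
    """
    rng = np.random.default_rng(seed)
    n = len(x)
    # Crude initialization from the data range.
    mu = rng.uniform(x.min(), x.max(), n_components)
    var = np.full(n_components, x.var() + 1e-6)
    pi = np.full(n_components, 1.0 / n_components)
    for _ in range(n_iter):
        # E-step: posterior responsibilities p(z = k | x) for every point.
        log_r = (-0.5 * (x[:, None] - mu) ** 2 / var
                 - 0.5 * np.log(2 * np.pi * var) + np.log(pi))
        r = np.exp(log_r - log_r.max(axis=1, keepdims=True))
        r /= r.sum(axis=1, keepdims=True)
        # Clamp responsibilities of labeled points to their known component.
        for k in range(n_components):
            mask = labels == k
            r[mask] = 0.0
            r[mask, k] = 1.0
        # M-step: re-estimate means, variances, and mixing weights.
        nk = r.sum(axis=0) + 1e-12
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk + 1e-6
        pi = nk / n
    return mu, var, pi, r
```

Under the paper's terminology this is a generative approach: all points, labeled or not, contribute to the parameter updates, which is the property the abstract credits for the generative model's advantage when the model is well specified.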
Pages: 1-10
Page count: 10
Related Papers
50 records
  • [1] Deep Bayesian Active Semi-Supervised Learning
    Rottmann, Matthias
    Kahl, Karsten
    Gottschalk, Hanno
    2018 17TH IEEE INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND APPLICATIONS (ICMLA), 2018, : 158 - 164
  • [2] Semi-supervised learning for Bayesian pattern classification
    Center, JL
    Bayesian Inference and Maximum Entropy Methods in Science and Engineering, 2005, 803 : 517 - 524
  • [3] Semi-supervised binary classification with latent distance learning
    Kamal, Imam Mustafa
    Bae, Hyerim
    ADVANCED ENGINEERING INFORMATICS, 2024, 61
  • [4] Latent Space Virtual Adversarial Training for Supervised and Semi-Supervised Learning
    Osada, Genki
    Ahsan, Budrul
    Prasad Bora, Revoti
    Nishide, Takashi
    IEICE TRANSACTIONS ON INFORMATION AND SYSTEMS, 2022, E105D (03) : 667 - 678
  • [5] Semi-supervised mixture of latent factor analysis models with application to online key variable estimation
    Shao, Weiming
    Ge, Zhiqiang
    Song, Zhihuan
    CONTROL ENGINEERING PRACTICE, 2019, 84 : 32 - 47
  • [6] Bayesian Semi-supervised Learning with Graph Gaussian Processes
    Ng, Yin Cheng
    Colombo, Nicolo
    Silva, Ricardo
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 31 (NIPS 2018), 2018, 31
  • [7] Semi-supervised learning for software quality estimation
    Seliya, N
    Khoshgoftaar, TM
    Zhong, S
    ICTAI 2004: 16TH IEEE INTERNATIONALCONFERENCE ON TOOLS WITH ARTIFICIAL INTELLIGENCE, PROCEEDINGS, 2004, : 183 - 190
  • [8] Bayesian semi-supervised learning with support vector machine
    Chakraborty, Sounak
    STATISTICAL METHODOLOGY, 2011, 8 (01) : 68 - 82
  • [9] Deep Latent-Variable Kernel Learning
    Liu, Haitao
    Ong, Yew-Soon
    Jiang, Xiaomo
    Wang, Xiaofang
    IEEE TRANSACTIONS ON CYBERNETICS, 2022, 52 (10) : 10276 - 10289
  • [10] Semi-supervised Bayesian ARTMAP
    Tang, Xiao-liang
    Han, Min
    APPLIED INTELLIGENCE, 2010, 33 (03) : 302 - 317