Accuracy of latent-variable estimation in Bayesian semi-supervised learning

Cited by: 3
Authors
Yamazaki, Keisuke [1 ]
Affiliations
[1] Tokyo Inst Technol, Dept Computat Intelligence & Syst Sci, Midori Ku, Yokohama, Kanagawa 227, Japan
Keywords
Latent-variable estimation; Generative and discriminative models; Bayes statistics; MAXIMUM-LIKELIHOOD;
DOI
10.1016/j.neunet.2015.04.012
Chinese Library Classification: TP18 [Theory of artificial intelligence];
Discipline codes: 081104; 0812; 0835; 1405;
Abstract
Hierarchical probabilistic models, such as Gaussian mixture models, are widely used for unsupervised learning tasks. These models consist of observable and latent variables, which represent the observable data and the underlying data-generation process, respectively. Unsupervised learning tasks, such as cluster analysis, are regarded as estimations of latent variables based on the observable ones. Estimating latent variables in semi-supervised learning, where some labels are observed, should be more precise than in the unsupervised case, and a central concern is to clarify the effect of the labeled data. However, the accuracy of latent-variable estimation has not been analyzed sufficiently from a theoretical standpoint. In a previous study, a distribution-based error function was formulated, and its asymptotic form was calculated for unsupervised learning with generative models. It was shown that, for the estimation of latent variables, the Bayes method is more accurate than the maximum-likelihood method. The present paper reveals the asymptotic forms of the error function in Bayesian semi-supervised learning for both discriminative and generative models. The results show that the generative model, which uses all of the given data, performs better when the model is well specified. (C) 2015 Elsevier Ltd. All rights reserved.
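To make the setting of the abstract concrete, the following is a minimal sketch (not the paper's method or its Bayesian treatment, which would average over parameters rather than use point estimates): semi-supervised latent-variable estimation in a 1-D two-component Gaussian mixture via EM, where a handful of observed labels fix the responsibilities of the labeled points. All data and parameter choices here are hypothetical illustrations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: two 1-D Gaussian components at -2 and +2 (hypothetical setup).
n = 200
z_true = rng.integers(0, 2, size=n)                 # latent component labels
x = rng.normal(loc=np.where(z_true == 0, -2.0, 2.0), scale=1.0)

labeled = np.zeros(n, dtype=bool)
labeled[:20] = True                                  # a few observed labels

# EM for the component means (maximum-likelihood point estimates);
# labeled points have their responsibilities clamped to the observed labels.
mu = np.array([-1.0, 1.0])
for _ in range(50):
    # E-step: posterior responsibility of component 1 (unit variances assumed)
    d0 = np.exp(-0.5 * (x - mu[0]) ** 2)
    d1 = np.exp(-0.5 * (x - mu[1]) ** 2)
    r1 = d1 / (d0 + d1)
    r1[labeled] = z_true[labeled]                    # observed labels override
    # M-step: responsibility-weighted means
    mu = np.array([np.sum((1 - r1) * x) / np.sum(1 - r1),
                   np.sum(r1 * x) / np.sum(r1)])

# Hard assignment of latent variables for the unlabeled data
z_hat = (r1 > 0.5).astype(int)
accuracy = np.mean(z_hat[~labeled] == z_true[~labeled])
```

The observed labels anchor component 1 to the right-hand cluster, which avoids the label-switching ambiguity of the purely unsupervised case; `accuracy` measures how well the latent assignments of the unlabeled points are recovered.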
Pages: 1-10 (10 pages)