Using Weak Supervision in Learning Gaussian Mixture Models

Cited by: 0
Authors
Ghosh, Soumya [1]
Srinivasan, Soundararajan [2]
Andrews, Burton [2]
Affiliations
[1] Univ Colorado, Dept Comp Sci, Boulder, CO 80309 USA
[2] Robert Bosch LLC, Res & Technol Ctr, Pittsburgh, PA 15212 USA
Keywords
DOI
None available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The expectation maximization (EM) algorithm is a popular approach to learning Gaussian mixture models from unlabeled data. In many applications, additional sources of information, such as a-priori knowledge of the mixing proportions, are also available alongside the unlabeled data. We present a weakly supervised approach in the form of a penalized expectation maximization algorithm that uses this a-priori knowledge to guide model training. The algorithm penalizes models whose predicted mixing proportions diverge strongly from the a-priori mixing proportions. We also present an extension that incorporates both labeled and unlabeled data in a semi-supervised setting. Systematic evaluations on several publicly available datasets show that the proposed algorithms outperform the expectation maximization algorithm. The performance gains are particularly significant when the amount of unlabeled data is limited and in the presence of noise.
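The penalty on mixing proportions described in the abstract can be realized in several ways; the sketch below uses one common variant, a Dirichlet-style shrinkage of the M-step mixing-proportion update toward the a-priori proportions. This is a hypothetical illustration of the general idea, not the authors' exact penalized objective: the function name, the penalty strength `lam`, and the 1-D setting are all assumptions for illustration.

```python
import numpy as np

def penalized_em_gmm(X, prior_pi, lam=1.0, n_iter=50, seed=0):
    """EM for a 1-D Gaussian mixture whose mixing-proportion update is
    shrunk toward a-priori proportions (a Dirichlet-style penalty).
    Hypothetical sketch, not the paper's exact algorithm."""
    rng = np.random.default_rng(seed)
    prior_pi = np.asarray(prior_pi, dtype=float)
    K, n = len(prior_pi), len(X)
    mu = rng.choice(X, K, replace=False).astype(float)  # random data points
    var = np.full(K, X.var())
    pi = prior_pi.copy()
    for _ in range(n_iter):
        # E-step: responsibilities under current parameters (log-space for stability)
        log_r = (-0.5 * (X[:, None] - mu) ** 2 / var
                 - 0.5 * np.log(2 * np.pi * var) + np.log(pi))
        r = np.exp(log_r - log_r.max(axis=1, keepdims=True))
        r /= r.sum(axis=1, keepdims=True)
        Nk = r.sum(axis=0)
        # M-step: means and variances as in standard EM ...
        mu = (r * X[:, None]).sum(axis=0) / Nk
        var = (r * (X[:, None] - mu) ** 2).sum(axis=0) / Nk + 1e-6
        # ... but mixing proportions are pulled toward the prior:
        # lam=0 recovers plain EM; large lam pins pi to prior_pi.
        pi = (Nk + lam * n * prior_pi) / (n + lam * n)
    return pi, mu, var
```

With `lam > 0` the estimated proportions are a convex combination of the data-driven estimate `Nk / n` and the a-priori proportions, which matches the abstract's intuition that models far from the prior proportions should be penalized.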
Pages: 2389 / +
Page count: 2