Proportional data modeling via entropy-based variational Bayes learning of mixture models

Cited by: 11
Authors
Fan, Wentao [1]
Al-Osaimi, Faisal R. [2]
Bouguila, Nizar [3]
Du, Jixiang [1]
Affiliations
[1] Huaqiao Univ, Dept Comp Sci & Technol, Xiamen, Peoples R China
[2] Umm Al Qura Univ, Dept Comp Engn, Coll Comp Syst, Mecca, Saudi Arabia
[3] Concordia Univ, Concordia Inst Informat Syst Engn CIISE, Montreal, PQ, Canada
Funding
National Natural Science Foundation of China; Natural Sciences and Engineering Research Council of Canada
Keywords
Mixture models; Entropy; Variational Bayes; 3D objects; Identity verification; Document clustering; Gene expression; 3D OBJECT RECOGNITION; FACE; FEATURES; ROBUST; AUTHENTICATION; CLASSIFICATION; RETRIEVAL; ALGORITHM; SEARCH; SPEECH;
DOI
10.1007/s10489-017-0909-0
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Code
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Over the last few decades, many statistical approaches developed in the fields of computer vision and pattern recognition have been based on mixture models. A mixture-based representation has a number of advantages: mixture models are generative and flexible, and they can take prior information into account to improve generalization capability. The mixture models considered in this paper are based on the Dirichlet and generalized Dirichlet distributions, which have been widely used to represent proportional data. The novel aspect of this paper is the development of an entropy-based framework for learning these mixture models. Specifically, we propose a Bayesian framework for model learning by means of an entropy-based variational Bayes technique. We present experimental results showing that the proposed method is effective in several applications, namely person identity verification, 3D object recognition, text document clustering, and gene expression categorization.
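To make the modeling setting concrete, the sketch below illustrates the kind of model the abstract describes: a Dirichlet mixture over proportional data (vectors on the probability simplex). It is not the authors' entropy-based variational Bayes learner; the component parameters ALPHAS and mixing weights WEIGHTS are hypothetical values chosen only to show how such a mixture assigns posterior responsibilities to a proportional observation.

# Minimal sketch of a Dirichlet mixture model for proportional data.
# NOT the paper's entropy-based variational Bayes algorithm; ALPHAS and
# WEIGHTS are illustrative assumptions, and only an E-step-style
# responsibility computation is shown.
import numpy as np
from scipy.stats import dirichlet

# Two hypothetical Dirichlet components over 3-dimensional proportion vectors.
ALPHAS = [np.array([5.0, 2.0, 1.0]), np.array([1.0, 1.0, 6.0])]
WEIGHTS = np.array([0.6, 0.4])  # mixing proportions

def responsibilities(x):
    """Posterior probability of each component for one proportion vector x."""
    likes = np.array([w * dirichlet.pdf(x, a) for w, a in zip(WEIGHTS, ALPHAS)])
    return likes / likes.sum()

# Example: a proportional observation (components sum to 1),
# e.g. a normalized word-count histogram of a text document.
x = np.array([0.7, 0.2, 0.1])
print(responsibilities(x))  # responsibility of each mixture component

In a full learner, such responsibilities would drive updates of the component parameters; the paper's contribution is to perform that learning within an entropy-based variational Bayes framework rather than standard EM.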
Pages: 473-487
Page count: 15