The Kullback information criterion for mixture regression models

Cited by: 4
|
Authors
Hafidi, Bezza [1 ]
Mkhadri, Abdallah [2 ]
Affiliations
[1] Univ Ibno Zohr, Fac Sci, Agadir, Morocco
[2] Univ Cadi Ayyad, Fac Sci Semlalia, Marrakech, Morocco
Keywords
Symmetric divergence; Time series; Selection; Likelihood
DOI
10.1016/j.spl.2010.01.014
Chinese Library Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics];
Subject Classification Codes
020208 ; 070103 ; 0714 ;
Abstract
We consider the problem of jointly selecting the number of components and the variables in finite mixture regression models. Classical model selection criteria such as AIC and BIC may be unsatisfactory in this setting, especially when the sample size is small or the number of variables is large: they fit too many components and retain too many variables. An alternative mixture regression criterion, called MRC, which simultaneously determines the number of components and variables in mixture regression models, was proposed by Naik et al. (2007). In the same setting, we propose a new information criterion, called MRC(sd), for the simultaneous determination of the number of components and predictors. MRC(sd) is based on the Kullback symmetric divergence rather than the Kullback directed divergence used for MRC. We show that the new criterion performs better than MRC in a small simulation study. (C) 2010 Elsevier B.V. All rights reserved.
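For intuition about the distinction the abstract draws, the Kullback directed divergence is asymmetric in its two arguments, while the Kullback symmetric divergence (the quantity underlying MRC(sd)) is the sum of the two directed divergences and treats both models evenhandedly. A minimal sketch for univariate Gaussians, using the standard closed-form KL expression; the function names are illustrative and not from the paper:

```python
import math

def kl_gauss(mu1, s1, mu2, s2):
    # Directed Kullback-Leibler divergence KL(p || q) between
    # p = N(mu1, s1^2) and q = N(mu2, s2^2), in closed form.
    return math.log(s2 / s1) + (s1**2 + (mu1 - mu2)**2) / (2 * s2**2) - 0.5

def kl_symmetric(mu1, s1, mu2, s2):
    # Kullback symmetric divergence J(p, q) = KL(p || q) + KL(q || p).
    return kl_gauss(mu1, s1, mu2, s2) + kl_gauss(mu2, s2, mu1, s1)

# The directed divergence depends on the order of its arguments,
# while the symmetric divergence does not.
d_pq = kl_gauss(0.0, 1.0, 1.0, 2.0)
d_qp = kl_gauss(1.0, 2.0, 0.0, 1.0)
j = kl_symmetric(0.0, 1.0, 1.0, 2.0)  # equals d_pq + d_qp
```

The asymmetry of the directed divergence is what makes a criterion built on it sensitive to which model plays the role of the "true" distribution; symmetrizing removes that sensitivity, which is the motivation the abstract gives for MRC(sd).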
Pages: 807-815
Page count: 9
Related Papers
50 records total
  • [1] Extending the Akaike information criterion to mixture regression models
    Naik, Prasad A.
    Shi, Peide
    Tsai, Chih-Ling
    [J]. JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION, 2007, 102 (477) : 244 - 254
  • [2] Smoothing parameter selection in nonparametric regression using an improved Kullback Information Criterion
    Bekara, M
    Hafidi, B
    Fleury, G
    [J]. ISSPA 2005: THE 8TH INTERNATIONAL SYMPOSIUM ON SIGNAL PROCESSING AND ITS APPLICATIONS, VOLS 1 AND 2, PROCEEDINGS, 2005, : 887 - 890
  • [3] Performance of Akaike Information Criterion and Bayesian Information Criterion in Selecting Partition Models and Mixture Models
    Liu, Qin
    Charleston, Michael A.
    Richards, Shane A.
    Holland, Barbara R.
    [J]. SYSTEMATIC BIOLOGY, 2023, 72 (01) : 92 - 105
  • [4] Kullback-Leibler Divergence and Akaike Information Criterion in General Hidden Markov Models
    Fuh, Cheng-Der
    Kao, Chu-Lan Michael
    Pang, Tianxiao
    [J]. IEEE TRANSACTIONS ON INFORMATION THEORY, 2024, 70 (08) : 5888 - 5909
  • [5] An Akaike information criterion for multiple event mixture cure models
    Dirick, Lore
    Claeskens, Gerda
    Baesens, Bart
    [J]. EUROPEAN JOURNAL OF OPERATIONAL RESEARCH, 2015, 241 (02) : 449 - 457
  • [6] Unifying the Derivations of Kullback Information Criterion and Corrected Versions
    Keerativibool, Warangkhana
    [J]. THAILAND STATISTICIAN, 2014, 12 (01): : 37 - 53
  • [7] Consistent Bayesian information criterion based on a mixture prior for possibly high-dimensional multivariate linear regression models
    Kono, Haruki
    Kubokawa, Tatsuya
    [J]. SCANDINAVIAN JOURNAL OF STATISTICS, 2023, 50 (03) : 1022 - 1047
  • [8] Parameter identifiability with Kullback-Leibler information divergence criterion
    Chen, Badong
    Hu, Jinchun
    Zhu, Yu
    Sun, Zengqi
    [J]. INTERNATIONAL JOURNAL OF ADAPTIVE CONTROL AND SIGNAL PROCESSING, 2009, 23 (10) : 940 - 960
  • [9] Variable selection using a smooth information criterion for distributional regression models
    O'Neill, Meadhbh
    Burke, Kevin
    [J]. STATISTICS AND COMPUTING, 2023, 33 (3)
  • [10] Quantile regression models with factor-augmented predictors and information criterion
    Ando, Tomohiro
    Tsay, Ruey S.
    [J]. ECONOMETRICS JOURNAL, 2011, 14 (01): : 1 - 24