Regularization and selection in Gaussian mixture of autoregressive models

Cited by: 9
Authors
Khalili, Abbas [1 ]
Chen, Jiahua [2 ]
Stephens, David A. [1 ]
Affiliations
[1] McGill Univ, Dept Math & Stat, Montreal, PQ, Canada
[2] Univ British Columbia, Dept Stat, Vancouver, BC, Canada
Funding
Natural Sciences and Engineering Research Council of Canada
Keywords
Autoregressive models; information criteria; LASSO; mixture models; SCAD; MSC 2010: primary 62F10, 62F12; secondary 62J07, 62M10; variable selection; likelihood; shrinkage; inference; variance
DOI
10.1002/cjs.11332
Chinese Library Classification
O21 [Probability theory and mathematical statistics]; C8 [Statistics]
Discipline codes
020208; 070103; 0714
Abstract
Gaussian mixtures of autoregressive models can be adopted to explain heterogeneous behaviour in the mean, volatility, and multi-modality of the conditional or marginal distributions of time series. One important task is to infer the number of autoregressive regimes and the autoregressive orders. Information-theoretic criteria such as AIC or BIC are commonly used for such inference, and typically evaluate each regime/autoregressive-order combination separately in order to choose an optimal model. However, the number of combinations can be so large that such an approach is computationally infeasible. In this article we first develop a computationally efficient regularization method for simultaneous autoregressive-order and parameter estimation when the number of autoregressive regimes is pre-determined. We then propose a regularized Bayesian information criterion (RBIC) to select the number of regimes. We study asymptotic properties of the proposed methods, and investigate their finite sample performance via simulations. We show that asymptotically the RBIC does not underestimate the number of autoregressive regimes, and provide a discussion of the current challenges in investigating whether and under what conditions the RBIC provides a consistent estimator of the number of regimes. We finally analyze U.S. gross domestic product growth and unemployment rate data to demonstrate the proposed methods. The Canadian Journal of Statistics 45: 356-374; 2017 (c) 2017 Statistical Society of Canada
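The model class summarized in the abstract can be sketched in a few lines of code: a minimal, plain EM fit of a two-regime Gaussian mixture of AR(1) models on simulated data. All parameter values below are illustrative assumptions, and this sketch implements only the unpenalized mixture likelihood; the paper's SCAD/LASSO-penalized estimation and the RBIC are not implemented here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a two-regime Gaussian mixture of AR(1) models: at each time t,
# y_t | y_{t-1} is drawn from regime k with probability alpha_k.
# (Illustrative parameters, not taken from the paper.)
alpha = np.array([0.6, 0.4])      # mixing weights
phi0 = np.array([0.0, 2.0])       # regime intercepts
phi1 = np.array([0.5, -0.3])      # AR(1) coefficients
sigma = np.array([0.5, 1.0])      # regime noise standard deviations

T = 2000
y = np.zeros(T)
for t in range(1, T):
    k = rng.choice(2, p=alpha)
    y[t] = phi0[k] + phi1[k] * y[t - 1] + sigma[k] * rng.normal()

# Conditional-likelihood setup: design matrix of (intercept, lag-1) terms.
X = np.column_stack([np.ones(T - 1), y[:-1]])
z = y[1:]

# EM for the unpenalized mixture loglikelihood (the paper adds a
# SCAD/LASSO-type penalty on the AR coefficients on top of this).
a = np.array([0.5, 0.5])                      # weight estimates
b = np.array([[0.1, 0.1], [0.1, -0.1]])       # rows: (phi0, phi1) per regime
s = np.array([1.0, 1.0])                      # sigma estimates

for _ in range(200):
    # E-step: responsibility of each regime for each observation.
    dens = np.stack([
        a[k] * np.exp(-0.5 * ((z - X @ b[k]) / s[k]) ** 2)
        / (s[k] * np.sqrt(2 * np.pi))
        for k in range(2)
    ])
    r = dens / dens.sum(axis=0)
    # M-step: responsibility-weighted least squares per regime.
    for k in range(2):
        w = r[k]
        b[k] = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * z))
        s[k] = np.sqrt((w * (z - X @ b[k]) ** 2).sum() / w.sum())
    a = r.mean(axis=1)

print("weights:", a)
print("(phi0, phi1) per regime:", b)
print("sigmas:", s)
```

With well-separated regimes and a moderately long series, the EM estimates land close to the simulation values (up to label switching); regularized order selection would then shrink redundant AR coefficients in each regime toward zero.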
Pages: 356-374 (19 pages)
Related papers (50 in total)
  • [31] Deep Gaussian mixture models
    Viroli, Cinzia
    McLachlan, Geoffrey J.
    [J]. Statistics and Computing, 2019, 29(1): 43-51
  • [32] Autoregressive density modeling with the Gaussian process mixture transition distribution
    Heiner, Matthew
    Kottas, Athanasios
    [J]. Journal of Time Series Analysis, 2022, 43(2): 157-177
  • [33] Parsimonious Gaussian mixture models
    McNicholas, Paul David
    Murphy, Thomas Brendan
    [J]. Statistics and Computing, 2008, 18(3): 285-296
  • [34] Fuzzy Gaussian mixture models
    Ju, Zhaojie
    Liu, Honghai
    [J]. Pattern Recognition, 2012, 45(3): 1146-1158
  • [36] Combining Gaussian mixture models
    Lee, H. J.
    Cho, S.
    [J]. Intelligent Data Engineering and Automated Learning (IDEAL 2004), Proceedings, 2004, 3177: 666-671
  • [37] Feature selection for pattern classification with Gaussian mixture models: A new objective criterion
    Krishnan, S.
    Samudravijaya, K.
    Rao, P. V. S.
    [J]. Pattern Recognition Letters, 1996, 17(8): 803-809
  • [39] Fast forward feature selection of hyperspectral images for classification with Gaussian mixture models
    Fauvel, Mathieu
    Dechesne, Clement
    Zullo, Anthony
    Ferraty, Frederic
    [J]. IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 2015, 8(6): 2824-2831
  • [40] Mixture periodic autoregressive time series models
    Shao, Q.
    [J]. Statistics & Probability Letters, 2006, 76(6): 609-618