Dynamic principal component CAW models for high-dimensional realized covariance matrices

Cited by: 2
Authors
Gribisch, Bastian [1 ]
Stollenwerk, Michael [2 ]
Affiliations
[1] Univ Cologne, Inst Econometr & Stat, Univ Str 22a, D-50937 Cologne, Germany
[2] Heidelberg Univ, Alfred Weber Inst Econ, Heidelberg, Germany
Keywords
Realized volatility; Covariance matrix; Spectral decomposition; Time-series models; ECONOMETRIC-ANALYSIS; LONG-MEMORY; MULTIVARIATE; VOLATILITY; REGRESSION;
DOI
10.1080/14697688.2019.1701197
Chinese Library Classification: F8 [Finance]
Discipline code: 0202
Abstract
We propose a new dynamic principal component CAW model (DPC-CAW) for time-series of high-dimensional realized covariance matrices of asset returns (up to 100 assets). The model performs a spectral decomposition of the scale matrix of a central Wishart distribution and assumes independent dynamics for the principal components' variances and the eigenvector processes. A three-step estimation procedure makes the model applicable to high-dimensional covariance matrices. We analyze the finite sample properties of the estimation approach and provide an empirical application to realized covariance matrices for 100 assets. The DPC-CAW model has particularly good forecasting properties and outperforms its competitors for realized covariance matrices.
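The abstract's core building block is the spectral decomposition of a (realized) covariance matrix into eigenvectors and principal components' variances, which the DPC-CAW model applies to the scale matrix of a central Wishart distribution. A minimal NumPy sketch of that decomposition step on toy data (not the authors' three-step estimator; the return data and dimensions are illustrative assumptions):

```python
import numpy as np

# Illustrative only: decompose a realized covariance proxy S = Q diag(lam) Q',
# the spectral form the DPC-CAW model imposes on the Wishart scale matrix.
rng = np.random.default_rng(0)
X = rng.standard_normal((250, 5))        # toy data: 250 return observations, 5 assets
S = X.T @ X / X.shape[0]                 # symmetric PSD realized covariance proxy

lam, Q = np.linalg.eigh(S)               # lam: PC variances, Q: orthonormal eigenvectors
S_rebuilt = Q @ np.diag(lam) @ Q.T       # exact reconstruction from the decomposition

print(np.allclose(S, S_rebuilt))         # True
```

In the model, separate univariate dynamics would then be specified for each entry of `lam` and for the eigenvector process `Q`, which is what makes estimation feasible in high dimensions.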
Pages: 799-821 (23 pages)