PRINCIPAL COMPONENT ANALYSIS FOR SECOND-ORDER STATIONARY VECTOR TIME SERIES

Cited by: 29
Authors
Chang, Jinyuan [1 ,2 ]
Guo, Bin [1 ,2 ]
Yao, Qiwei [3 ]
Affiliations
[1] Southwestern Univ Finance & Econ, Sch Stat, Chengdu 611130, Sichuan, Peoples R China
[2] Southwestern Univ Finance & Econ, Ctr Stat Res, Chengdu 611130, Sichuan, Peoples R China
[3] London Sch Econ & Polit Sci, Dept Stat, London WC2A 2AE, England
Source
ANNALS OF STATISTICS | 2018, Vol. 46, No. 5
Funding
Engineering and Physical Sciences Research Council (UK);
Keywords
alpha-mixing; autocorrelation; cross-correlation; dimension reduction; eigenanalysis; high-dimensional time series; weak stationarity; LATENT FACTORS; NUMBER; MODEL;
DOI
10.1214/17-AOS1613
Chinese Library Classification (CLC)
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics];
Subject Classification Codes
020208 ; 070103 ; 0714 ;
Abstract
We extend principal component analysis (PCA) to second-order stationary vector time series in the sense that we seek a contemporaneous linear transformation for a p-variate time series such that the transformed series is segmented into several lower-dimensional subseries, and those subseries are uncorrelated with each other both contemporaneously and serially. Therefore, those lower-dimensional series can be analyzed separately as far as the linear dynamic structure is concerned. Technically, it boils down to an eigenanalysis of a positive definite matrix. When p is large, an additional step is required to perform a permutation in terms of either maximum cross-correlations or the false discovery rate (FDR) based on multiple tests. The asymptotic theory is established for both fixed p and diverging p when the sample size n tends to infinity. Numerical experiments with both simulated and real data sets indicate that the proposed method is an effective initial step in analyzing multiple time series data, leading to substantial dimension reduction in modelling and forecasting high-dimensional linear dynamical structures. Unlike PCA for independent data, there is no guarantee that the required linear transformation exists. When it does not, the proposed method provides an approximate segmentation, which still yields advantages in, for example, forecasting future values. The method can also be adapted to segment multiple volatility processes.
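The eigenanalysis described in the abstract can be illustrated with a short sketch. The construction below (prewhitening, summing outer products of lagged autocovariances into a positive definite matrix, then an eigendecomposition) is a simplification for illustration only: the lag truncation k0, the function name segment_transform, and the omission of the permutation/FDR grouping step are assumptions made here, not the authors' published implementation.

```python
import numpy as np

def segment_transform(Y, k0=5):
    """Illustrative segmentation transform for an (n x p) series Y.

    Simplified recipe following the abstract:
      1. standardize so the contemporaneous covariance is the identity,
      2. build a positive definite matrix from lagged autocovariances,
      3. eigenanalysis of that matrix gives the contemporaneous linear
         transformation; the permutation/grouping step is omitted.
    """
    n, p = Y.shape
    Y = Y - Y.mean(axis=0)

    # Step 1: prewhiten so that var(y_t) is (approximately) the identity.
    S0 = Y.T @ Y / n
    vals, vecs = np.linalg.eigh(S0)
    S0_inv_half = vecs @ np.diag(vals ** -0.5) @ vecs.T
    Z = Y @ S0_inv_half

    # Step 2: W = I + sum_{k=1}^{k0} Sigma(k) Sigma(k)^T, positive definite
    # by construction, where Sigma(k) is the lag-k autocovariance of the
    # whitened series.
    W = np.eye(p)
    for k in range(1, k0 + 1):
        Sk = Z[k:].T @ Z[:-k] / n
        W += Sk @ Sk.T

    # Step 3: eigenanalysis; columns of B define the transformation x_t = B^T y_t.
    _, A = np.linalg.eigh(W)
    B = S0_inv_half @ A
    X = Y @ B      # transformed series whose components are then grouped
    return X, B
```

Applied to a simulated p-variate series, the columns of X would then be grouped into mutually uncorrelated subseries by inspecting their cross-correlations (or via FDR-based multiple testing), as the abstract describes.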
Pages: 2094 - 2124
Number of pages: 31
Related Papers
50 records in total
  • [21] Principal component analysis for non-stationary time series based on detrended cross-correlation analysis
    Zhao, Xiaojun
    Shang, Pengjian
    NONLINEAR DYNAMICS, 2016, 84 (02) : 1033 - 1044
  • [22] APPLICATION OF PRINCIPAL COMPONENT ANALYSIS TO TIME SERIES MODELING
    STEWART, JR
    BIOMETRICS, 1972, 28 (01) : 272 - &
  • [23] Functional principal component analysis of financial time series
    Ingrassia, S
    Costanzo, GD
    New Developments in Classification and Data Analysis, 2005, : 351 - 358
  • [24] Efficient Schur Parametrization and Modeling of p-Stationary Second-Order Time-Series for LPC Transmission
    Wielgus, Agnieszka
    Zarzycki, Jan
    INTERNATIONAL JOURNAL OF ELECTRONICS AND TELECOMMUNICATIONS, 2018, 64 (03) : 343 - 350
  • [25] Second-Order Analysis for the Time Crisis Problem
    Bayen, Terence
    Pfeiffer, Laurent
    JOURNAL OF CONVEX ANALYSIS, 2020, 27 (01) : 139 - 163
  • [26] Second-order moving average and scaling of stochastic time series
    Alessio, E
    Carbone, A
    Castelli, G
    Frappietro, V
    EUROPEAN PHYSICAL JOURNAL B, 2002, 27 (02): : 197 - 200
  • [28] Second-order Confidence Network for Early Classification of Time Series
    Lv, Junwei
    Chu, Yuqi
    Hu, Jun
    Li, Peipei
    Hu, Xuegang
    ACM TRANSACTIONS ON INTELLIGENT SYSTEMS AND TECHNOLOGY, 2024, 15 (01)
  • [29] Estimation of second-order properties from jittered time series
    Thomson, PJ
    Robinson, PM
    ANNALS OF THE INSTITUTE OF STATISTICAL MATHEMATICS, 1996, 48 (01) : 29 - 48
  • [30] Robust principal component analysis via weighted nuclear norm with modified second-order total variation regularization
    Dou, Yi
    Liu, Xinling
    Zhou, Min
    Wang, Jianjun
    VISUAL COMPUTER, 2023, 39 (08): : 3495 - 3505