Optimal dimension reduction for high-dimensional and functional time series

Cited by: 4
Authors
Hallin M. [1 ,2 ]
Hörmann S. [1 ,2 ,3 ]
Lippi M. [4 ]
Affiliations
[1] ECARES, Université libre de Bruxelles, Brussels
[2] Département de Mathématique, Université libre de Bruxelles, Brussels
[3] Institute for Statistics, Graz University of Technology, Graz
[4] Einaudi Institute for Economics and Finance, Rome
Keywords
Dimension reduction; Dynamic principal components; Functional principal components; Karhunen–Loève expansion; Principal components; Time series;
DOI
10.1007/s11203-018-9172-1
Abstract
Dimension reduction techniques are at the core of the statistical analysis of high-dimensional and functional observations. Whether the data are vector- or function-valued, principal component techniques play a central role in this context. The success of principal components in the dimension reduction problem is explained by the fact that, for any K ≤ p, the first K coefficients in the expansion of a p-dimensional random vector X in terms of its principal components provide the best linear K-dimensional summary of X in the mean-square sense. The same property holds for a random function and its functional principal component expansion. This optimality feature, however, no longer holds in a time series context: when the observations are serially dependent, principal components and functional principal components lose their optimal dimension reduction property to the so-called dynamic principal components introduced by Brillinger in 1981 in the vector case and, in the functional case, to their functional extension proposed by Hörmann, Kidziński and Hallin in 2015. © 2018, Springer Science+Business Media B.V., part of Springer Nature.
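The static optimality property the abstract invokes (for i.i.d. or cross-sectional data, truncating the principal component expansion after K terms gives the best K-dimensional linear summary in mean square) can be checked numerically. The sketch below, using NumPy with simulated data and illustrative variable names, compares the reconstruction error of the top-K eigenvector subspace against an arbitrary K-dimensional linear summary; it illustrates only the classical property, not the dynamic principal components the paper advocates for serially dependent observations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate n observations of a p-dimensional vector X with correlated coordinates.
p, n, K = 10, 5000, 3
A = rng.normal(size=(p, p))
X = rng.normal(size=(n, p)) @ A.T          # population covariance A A'
X -= X.mean(axis=0)                        # centre the sample

# Principal components: eigenvectors of the sample covariance matrix.
cov = X.T @ X / n
eigvals, eigvecs = np.linalg.eigh(cov)     # eigenvalues in ascending order
V = eigvecs[:, ::-1][:, :K]                # top-K eigenvectors

def recon_mse(X, B):
    """Mean-square error of reconstructing X from its projection onto span(B)."""
    Q, _ = np.linalg.qr(B)                 # orthonormal basis of the K-dim subspace
    X_hat = X @ Q @ Q.T
    return np.mean(np.sum((X - X_hat) ** 2, axis=1))

mse_pca = recon_mse(X, V)                  # best K-dimensional linear summary
mse_rand = recon_mse(X, rng.normal(size=(p, K)))  # an arbitrary K-dim subspace

print("PCA MSE:", mse_pca, " random-subspace MSE:", mse_rand)
```

As the theory predicts, `mse_pca` never exceeds `mse_rand`, and it equals the sum of the p − K smallest eigenvalues of the sample covariance; with serially dependent data, Brillinger's dynamic principal components would improve further on this bound.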
Pages: 385–398 (13 pages)
Related papers (50 total)
  • [41] Chen R., Yang D., Zhang C.-H. Factor Models for High-Dimensional Tensor Time Series. Journal of the American Statistical Association, 2022, 117(537): 94–116.
  • [42] Zhao W., Li R., Lian H. High-dimensional quantile varying-coefficient models with dimension reduction. Metrika, 2022, 85(1): 1–19.
  • [43] Cheung K.Y., Lee S.M.S. High-dimensional local polynomial regression with variable selection and dimension reduction. Statistics and Computing, 2024, 34.
  • [44] Seifert B., Korn K., Hartmann S., Uhl C. Dynamical Component Analysis (DyCA): Dimensionality Reduction for High-Dimensional Deterministic Time-Series. 2018 IEEE 28th International Workshop on Machine Learning for Signal Processing (MLSP), 2018.
  • [45] Xue K., Yang J., Yao F. Optimal Linear Discriminant Analysis for High-Dimensional Functional Data. Journal of the American Statistical Association, 2024, 119(546): 1055–1064.
  • [46] Park J.-H., Sriram T.N., Yin X. Dimension Reduction in Time Series. Statistica Sinica, 2010, 20(2): 747–770.
  • [47] Gao Y., Shang H.L., Yang Y. High-dimensional functional time series forecasting: An application to age-specific mortality rates. Journal of Multivariate Analysis, 2019, 170: 232–243.
  • [48] Gershenfeld N.A. Dimension Measurement on High-Dimensional Systems. Physica D, 1992, 55(1–2): 135–154.
  • [49] Li Y., Chai Y., Zhou H., Yin H. A novel dimension reduction and dictionary learning framework for high-dimensional data classification. Pattern Recognition, 2021, 112.
  • [50] Yoon H., Shahabi C., Winstein C.J., Jang J.-H. Progression-Preserving Dimension Reduction for High-Dimensional Sensor Data Visualization. ETRI Journal, 2013, 35(5): 911–914.