Modeling and Learning on High-Dimensional Matrix-Variate Sequences

Cited by: 0
Authors
Zhang, Xu [1 ,2 ]
Liu, Catherine C. [2 ]
Guo, Jianhua [3 ]
Yuen, K. C. [4 ]
Welsh, A. H. [5 ]
Affiliations
[1] South China Normal Univ, Guangzhou, Peoples R China
[2] Hong Kong Polytech Univ, Hung Hom, Hong Kong, Peoples R China
[3] Beijing Technol & Business Univ, Beijing, Peoples R China
[4] Univ Hong Kong, Hong Kong, Peoples R China
[5] Australian Natl Univ, Canberra, Australia
Funding
Australian Research Council; National Natural Science Foundation of China
Keywords
Image reconstruction; Matrix factor model; Peak signal-to-noise ratio; Rank decomposition; Separable covariance structure; Tensor subspace; TENSOR DECOMPOSITIONS; NUMBER; PCA;
DOI
10.1080/01621459.2024.2344687
Chinese Library Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Subject Classification Codes
020208; 070103; 0714
Abstract
We propose a new matrix factor model, named RaDFaM, which is strictly derived from the general rank decomposition and assumes a high-dimensional vector factor model structure for each basis vector. RaDFaM contributes a novel class of low-rank latent structures that trade off between signal intensity and dimension reduction from a tensor subspace perspective. Based on the intrinsic separable covariance structure of RaDFaM, for a collection of matrix-valued observations, we derive a new class of PCA variants for estimating the loading matrices and, subsequently, the latent factor matrices. The peak signal-to-noise ratio of RaDFaM is proved to be superior within the category of PCA-type estimators. We also establish an asymptotic theory including the consistency, convergence rates, and asymptotic distributions for components in the signal part. Numerically, we demonstrate the performance of RaDFaM in applications such as matrix reconstruction, supervised learning, and clustering, on uncorrelated and correlated data, respectively. Supplementary materials for this article are available online, including a standardized description of the materials available for reproducing the work.
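To fix ideas, the abstract's setting can be illustrated with a generic matrix factor model X_t = R F_t C' + E_t and a standard PCA-type estimator that eigendecomposes the averaged row and column second-moment matrices. This is a minimal sketch of the general estimation strategy the abstract refers to, not the paper's RaDFaM estimator; all dimensions, the simulation design, and the scaling conventions (loadings normalized so R'R = pI, C'C = qI) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# T observations of p x q matrices; latent factor matrices are k x r.
T, p, q, k, r = 200, 20, 15, 3, 2

# Simulate a generic matrix factor model X_t = R F_t C' + E_t.
R = rng.standard_normal((p, k))            # row loading matrix
C = rng.standard_normal((q, r))            # column loading matrix
F = rng.standard_normal((T, k, r))         # latent factor matrices
E = 0.1 * rng.standard_normal((T, p, q))   # additive noise
X = np.einsum('pk,tkr,qr->tpq', R, F, C) + E

# PCA-type estimation: top eigenvectors of the averaged second moments.
M_row = sum(Xt @ Xt.T for Xt in X) / (T * q)   # p x p
M_col = sum(Xt.T @ Xt for Xt in X) / (T * p)   # q x q
_, vec_row = np.linalg.eigh(M_row)             # eigenvalues ascending
_, vec_col = np.linalg.eigh(M_col)
R_hat = vec_row[:, -k:] * np.sqrt(p)           # top-k, scaled: R'R = p*I
C_hat = vec_col[:, -r:] * np.sqrt(q)           # top-r, scaled: C'C = q*I

# Recover latent factor matrices given the estimated loadings.
F_hat = np.einsum('pk,tpq,qr->tkr', R_hat, X, C_hat) / (p * q)

# Reconstruct the signal part and measure relative reconstruction error.
X_hat = np.einsum('pk,tkr,qr->tpq', R_hat, F_hat, C_hat)
rel_err = np.linalg.norm(X_hat - X) / np.linalg.norm(X)
```

Note that the reconstruction `X_hat` equals the two-sided projection of each observation onto the estimated row and column subspaces, so it is invariant to the rotational indeterminacy of the loadings, which is why the sketch can simulate with unnormalized Gaussian loadings.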
Pages: 16