Subspace perspective on canonical correlation analysis: Dimension reduction and minimax rates

Cited by: 5
Authors
Ma, Zhuang [1 ]
Li, Xiaodong [2 ]
Affiliations
[1] Univ Penn, Wharton Sch, Dept Stat, 3730 Walnut St,Suite 400, Philadelphia, PA 19104 USA
[2] Univ Calif Davis, Dept Stat, Davis, CA 95616 USA
Keywords
canonical correlation analysis; dimension reduction; minimax rates;
DOI
10.3150/19-BEJ1131
Chinese Library Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics];
Subject Classification Codes
020208; 070103; 0714;
Abstract
Canonical correlation analysis (CCA) is a fundamental statistical tool for exploring the correlation structure between two sets of random variables. In this paper, motivated by the recent success of applying CCA to learn low-dimensional representations of high-dimensional objects, we propose two losses based on the principal angles between the model spaces spanned by the sample canonical variates and their population counterparts. We further characterize non-asymptotic error bounds for the estimation risks under the proposed error metrics, which reveal how the performance of sample CCA depends adaptively on key quantities including the dimensions, the sample size, the condition numbers of the covariance matrices, and in particular the population canonical correlation coefficients. The optimality of our uniform upper bounds is justified by a lower-bound analysis based on stringent and localized parameter spaces. To the best of our knowledge, our paper is the first to separate $p_1$ and $p_2$ in the first-order term of the upper bounds without assuming the residual correlations are zero. More significantly, our paper is the first to derive the factor $(1-\lambda_k^2)(1-\lambda_{k+1}^2)/(\lambda_k-\lambda_{k+1})^2$ in non-asymptotic CCA estimation convergence rates, which is essential for understanding the behavior of CCA when the leading canonical correlation coefficients are close to 1.
Pages: 432-470 (39 pages)
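As a concrete illustration of the subspace viewpoint described in the abstract, the sketch below computes sample CCA via an SVD of the whitened sample cross-covariance and evaluates a principal-angle loss between the estimated and population canonical subspaces. This is a minimal numerical sketch, not the authors' implementation: the function names (`sample_cca`, `principal_angle_loss`) and the choice to measure the angles in the Sigma_xx inner product are illustrative assumptions, and the paper's exact loss normalization may differ.

```python
import numpy as np

def sample_cca(X, Y, k):
    """Sample CCA via an SVD of the whitened cross-covariance.

    X: (n, p1) and Y: (n, p2) data matrices with observations in rows.
    Returns the top-k sample canonical correlations and loading matrices
    whose columns span the estimated canonical subspaces.
    Assumes n is large enough that both sample covariances are invertible.
    """
    n = X.shape[0]
    Xc, Yc = X - X.mean(axis=0), Y - Y.mean(axis=0)
    Sxx, Syy, Sxy = Xc.T @ Xc / n, Yc.T @ Yc / n, Xc.T @ Yc / n

    def inv_sqrt(S):
        # Inverse square root of a symmetric positive definite matrix.
        w, V = np.linalg.eigh(S)
        return V @ np.diag(1.0 / np.sqrt(w)) @ V.T

    Wx, Wy = inv_sqrt(Sxx), inv_sqrt(Syy)
    U, lam, Vt = np.linalg.svd(Wx @ Sxy @ Wy)
    A_hat = Wx @ U[:, :k]        # sample canonical loadings for X
    B_hat = Wy @ Vt.T[:, :k]     # sample canonical loadings for Y
    return lam[:k], A_hat, B_hat

def principal_angle_loss(A_hat, A, Sxx):
    """Sum of squared sines of the principal angles between the spans of
    the sample and population canonical variates, with the angles taken
    in the Sigma_xx inner product (an illustrative choice of metric)."""
    L = np.linalg.cholesky(Sxx)          # Sxx = L @ L.T

    def orthonormal_basis(M):
        # Orthonormal basis of col(M) under the Sxx inner product.
        Q, _ = np.linalg.qr(L.T @ M)
        return Q

    Q1, Q2 = orthonormal_basis(A_hat), orthonormal_basis(A)
    cosines = np.linalg.svd(Q1.T @ Q2, compute_uv=False)
    return float(np.sum(1.0 - cosines ** 2))
```

The SVD formulation used here is equivalent to the classical generalized-eigenvalue formulation of CCA; the loss takes values between 0 and k and vanishes exactly when the two k-dimensional subspaces coincide.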