Contextual tensor decomposition by projected alternating least squares

Cited: 0
Authors
Zhang, Nan [1 ]
Liu, Yanshuo [1 ]
Yang, Jichen [2 ]
Affiliations
[1] Fudan Univ, Sch Data Sci, Shanghai, Peoples R China
[2] Wanjia Asset Management Co Ltd, Shanghai, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Contextual dimension reduction; Low-rank approximation; Singular value decomposition; Tensor decomposition; REGRESSION; UNIQUENESS; RANK;
DOI
10.1080/03610918.2023.2196748
Chinese Library Classification (CLC)
O21 [Probability theory and mathematical statistics]; C8 [Statistics];
Discipline codes
020208 ; 070103 ; 0714 ;
Abstract
Tensor decomposition is among the most important tools for low-rank structure extraction in tensor data analysis. In many scenarios, contextual covariates from various domains are available, which can potentially drive the underlying structures of the observed data. The classical CANDECOMP/PARAFAC (CP) decomposition method only focuses on the relational information among objects and cannot make use of such additional information. In this article, we propose a contextual CP tensor decomposition framework that incorporates the additional covariate information to infer the intrinsic low-rank structure more accurately. Moreover, an algorithm called projected alternating least squares is proposed for parameter estimation. In particular, the projection procedure removes noise effectively from observations and improves the estimation accuracy when the dependence on covariates genuinely exists. We demonstrate the advantages of our proposal with comprehensive simulations and a real data analysis of GDELT political events. The empirical results show that our method offers an improved modeling strategy as well as efficient computation.
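The classical CP decomposition that the article builds on is commonly fitted by alternating least squares, cycling through the factor matrices and solving a linear least-squares problem for each in turn. The following is a minimal NumPy sketch of plain CP-ALS for a 3-way tensor (not the paper's projected variant, which additionally projects onto covariate spaces); the function name `cp_als` and all dimensions are illustrative assumptions.

```python
import numpy as np

def khatri_rao(U, V):
    # Column-wise Kronecker product: row u*V.shape[0] + v holds U[u, :] * V[v, :].
    return (U[:, None, :] * V[None, :, :]).reshape(-1, U.shape[1])

def cp_als(X, rank, n_iter=500, seed=0):
    """Plain alternating least squares for a rank-`rank` CP decomposition
    of a 3-way tensor X (illustrative sketch, not the paper's method)."""
    rng = np.random.default_rng(seed)
    I, J, K = X.shape
    A = rng.standard_normal((I, rank))
    B = rng.standard_normal((J, rank))
    C = rng.standard_normal((K, rank))
    # Mode-n unfoldings in C order: column index runs over the remaining modes.
    X1 = X.reshape(I, J * K)
    X2 = np.moveaxis(X, 1, 0).reshape(J, I * K)
    X3 = np.moveaxis(X, 2, 0).reshape(K, I * J)
    for _ in range(n_iter):
        # Each update solves min ||X_(n) - F M^T||_F for factor F, where
        # M^T M reduces to a Hadamard product of small R x R Gram matrices.
        A = X1 @ khatri_rao(B, C) @ np.linalg.pinv((B.T @ B) * (C.T @ C))
        B = X2 @ khatri_rao(A, C) @ np.linalg.pinv((A.T @ A) * (C.T @ C))
        C = X3 @ khatri_rao(A, B) @ np.linalg.pinv((A.T @ A) * (B.T @ B))
    return A, B, C
```

For an exactly low-rank, noise-free tensor this iteration typically recovers the factors up to the usual CP scaling and permutation indeterminacies; the paper's projected variant inserts a covariate-projection step around such updates.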
Pages: 16
Related papers
50 records
  • [1] Practical alternating least squares for tensor ring decomposition
    Yu, Yajie
    Li, Hanyu
    NUMERICAL LINEAR ALGEBRA WITH APPLICATIONS, 2024, 31 (03)
  • [2] Blockwise acceleration of alternating least squares for canonical tensor decomposition
    Evans, David
    Ye, Nan
    NUMERICAL LINEAR ALGEBRA WITH APPLICATIONS, 2023, 30 (06)
  • [3] Accelerating alternating least squares for tensor decomposition by pairwise perturbation
    Ma, Linjian
    Solomonik, Edgar
    NUMERICAL LINEAR ALGEBRA WITH APPLICATIONS, 2022, 29 (04)
  • [4] Partitioned Alternating Least Squares Technique for Canonical Polyadic Tensor Decomposition
    Tichavsky, Petr
    Phan, Anh-Huy
    Cichocki, Andrzej
    IEEE SIGNAL PROCESSING LETTERS, 2016, 23 (07) : 993 - 997
  • [5] PARTITIONED HIERARCHICAL ALTERNATING LEAST SQUARES ALGORITHM FOR CP TENSOR DECOMPOSITION
    Phan, Anh-Huy
    Tichavsky, Petr
    Cichocki, Andrzej
    2017 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2017, : 2542 - 2546
  • [6] A seminorm regularized alternating least squares algorithm for canonical tensor decomposition
    Chen, Yannan
    Sun, Wenyu
    Xi, Min
    Yuan, Jinyun
    JOURNAL OF COMPUTATIONAL AND APPLIED MATHEMATICS, 2019, 347 : 296 - 313
  • [7] Some convergence results on the Regularized Alternating Least-Squares method for tensor decomposition
    Li, Na
    Kindermann, Stefan
    Navasca, Carmeliza
    LINEAR ALGEBRA AND ITS APPLICATIONS, 2013, 438 (02) : 796 - 812
  • [8] A self-adaptive regularized alternating least squares method for tensor decomposition problems
    Mao, Xianpeng
    Yuan, Gonglin
    Yang, Yuning
    ANALYSIS AND APPLICATIONS, 2020, 18 (01) : 129 - 147
  • [9] On global convergence of alternating least squares for tensor approximation
    Yang, Yuning
    COMPUTATIONAL OPTIMIZATION AND APPLICATIONS, 2023, 84 : 509 - 529
  • [10] Fused Orthogonal Alternating Least Squares for Tensor Clustering
    Wang, Jiacheng
    Nicolae, Dan
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022