Tensor Regression Using Low-Rank and Sparse Tucker Decompositions

Cited: 11
Authors
Ahmed, Talal [1]
Raja, Haroon [2]
Bajwa, Waheed U. [1]
Affiliations
[1] Rutgers State Univ, Dept Elect & Comp Engn, Piscataway, NJ 08854 USA
[2] Univ Michigan, Dept Elect Engn & Comp Sci, Ann Arbor, MI 48109 USA
Funding
US National Science Foundation
Keywords
linear regression; sample complexity; sparsity; tensor regression; Tucker decomposition; fMRI; regularization; classification; selection; recovery
DOI
10.1137/19M1299335
CLC Classification
O29 [Applied Mathematics]
Subject Classification Code
070104
Abstract
This paper studies a tensor-structured linear regression model with a scalar response variable and tensor-structured predictors, such that the regression parameters form a tensor of order d (i.e., a d-fold multiway array) in ℝ^{n_1 × n_2 × ⋯ × n_d}. In particular, we focus on the task of estimating the regression tensor from m realizations of the response variable and the predictors, where m ≪ n = ∏_i n_i. Despite the seeming ill-posedness of this estimation problem, it can still be solved if the parameter tensor belongs to the space of sparse, low-Tucker-rank tensors. Accordingly, the estimation procedure is posed as a nonconvex optimization program over the space of sparse, low-Tucker-rank tensors, and a tensor variant of projected gradient descent is proposed to solve the resulting nonconvex problem. In addition, mathematical guarantees are provided that establish that the proposed method converges linearly to an appropriate solution under a certain set of conditions. Further, an upper bound on the sample complexity of tensor parameter estimation for the model under consideration is characterized for the special case when the individual (scalar) predictors independently draw values from a sub-Gaussian distribution. The sample complexity bound is shown to have a polylogarithmic dependence on n̄ = max{n_i : i ∈ {1, 2, …, d}}; orderwise, it matches the bound one can obtain from a heuristic parameter-counting argument. Finally, numerical experiments demonstrate the efficacy of the proposed tensor model and estimation method on a synthetic dataset and a collection of neuroimaging datasets pertaining to attention deficit hyperactivity disorder (ADHD). Specifically, the proposed method exhibits better sample complexities on both synthetic and real datasets, demonstrating the usefulness of the model and the method in settings where n ≫ m.
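The projected-gradient scheme described in the abstract can be sketched in a few lines of NumPy. This is an illustrative sketch only, not the paper's algorithm: the projection here is a simple hard-thresholding step followed by a truncated HOSVD, and the function names, step size, and iteration count are assumptions; the paper's exact projection operator, initialization, and convergence conditions are given in the full text (DOI 10.1137/19M1299335).

```python
import numpy as np

def project_sparse_tucker(B, ranks, s):
    """Heuristic projection onto sparse, low-Tucker-rank tensors:
    keep the s largest-magnitude entries, then truncate to multilinear
    rank `ranks` via a truncated HOSVD. Illustrative only."""
    # Sparsity step: zero out all but the s largest-magnitude entries.
    flat = B.ravel().copy()
    if s < flat.size:
        flat[np.argsort(np.abs(flat))[:-s]] = 0.0
    B = flat.reshape(B.shape)
    # Low-rank step: rank-r_i SVD of each mode-i unfolding (HOSVD).
    factors = []
    for i, r in enumerate(ranks):
        unfolding = np.moveaxis(B, i, 0).reshape(B.shape[i], -1)
        U, _, _ = np.linalg.svd(unfolding, full_matrices=False)
        factors.append(U[:, :r])
    core = B
    for i, U in enumerate(factors):  # core = B x_i U_i^T (mode-i product)
        core = np.moveaxis(np.tensordot(U.T, np.moveaxis(core, i, 0), axes=1), 0, i)
    out = core
    for i, U in enumerate(factors):  # reconstruct: core x_i U_i
        out = np.moveaxis(np.tensordot(U, np.moveaxis(out, i, 0), axes=1), 0, i)
    return out

def tensor_pgd(X, y, shape, ranks, s, step=0.05, iters=100):
    """Projected gradient descent for y_j ~ <B, X_j> with B constrained
    to be sparse and of low Tucker rank (sketch; hyperparameters assumed)."""
    B = np.zeros(shape)
    m = len(y)
    for _ in range(iters):
        resid = np.array([np.sum(B * Xj) for Xj in X]) - y
        grad = sum(r * Xj for r, Xj in zip(resid, X)) / m  # least-squares gradient
        B = project_sparse_tucker(B - step * grad, ranks, s)
    return B
```

With m ≪ n, the unconstrained least-squares problem is underdetermined; it is the projection onto the sparse, low-Tucker-rank set after every gradient step that makes recovery possible under the conditions analyzed in the paper.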
Pages: 944-966
Page count: 23
Related Papers
50 records total
  • [1] Minster, Rachel; Saibaba, Arvind K.; Kilmer, Misha E. Randomized Algorithms for Low-Rank Tensor Decompositions in the Tucker Format. SIAM Journal on Mathematics of Data Science, 2020, 2(1): 189-215.
  • [2] He, Lifang; Chen, Kun; Xu, Wanwan; Zhou, Jiayu; Wang, Fei. Boosted Sparse and Low-Rank Tensor Regression. Advances in Neural Information Processing Systems 31 (NIPS 2018), 2018.
  • [3] Dantas, Cassio F.; Cohen, Jeremy E.; Gribonval, Remi. Learning Fast Dictionaries for Sparse Representations Using Low-Rank Tensor Decompositions. Latent Variable Analysis and Signal Separation (LVA/ICA 2018), 2018, 10891: 456-466.
  • [4] Pan, Chenjian; Ling, Chen; He, Hongjin; Qi, Liqun; Xu, Yanwei. A low-rank and sparse enhanced Tucker decomposition approach for tensor completion. Applied Mathematics and Computation, 2024, 465.
  • [5] Chandrasekaran, Venkat; Sanghavi, Sujay; Parrilo, Pablo A.; Willsky, Alan S. Sparse and Low-Rank Matrix Decompositions. 47th Annual Allerton Conference on Communication, Control, and Computing, 2009: 962+.
  • [6] Cai, HanQin; Chao, Zehan; Huang, Longxiu; Needell, Deanna. Robust Tensor CUR Decompositions: Rapid Low-Tucker-Rank Tensor Recovery with Sparse Corruptions. SIAM Journal on Imaging Sciences, 2024, 17(1): 225-247.
  • [7] Shi, Yuqing; Du, Shiqiang; Wang, Weilan. Robust Low-Rank and Sparse Tensor Decomposition for Low-Rank Tensor Completion. Proceedings of the 33rd Chinese Control and Decision Conference (CCDC 2021), 2021: 7138-7143.
  • [8] Baskaran, Muthu; Langston, M. Harper; Ramananandro, Tahina; Bruns-Smith, David; Henretty, Tom; Ezick, James; Lethin, Richard. Accelerated Low-rank Updates to Tensor Decompositions. 2016 IEEE High Performance Extreme Computing Conference (HPEC), 2016.
  • [9] Shah, Parikshit; Rao, Nikhil; Tang, Gongguo. Sparse and Low-Rank Tensor Decomposition. Advances in Neural Information Processing Systems 28 (NIPS 2015), 2015.
  • [10] Xue, Niannan; Papamakarios, George; Bahri, Mehdi; Panagakis, Yannis; Zafeiriou, Stefanos. Robust Low-Rank Tensor Modelling Using Tucker and CP Decomposition. 2017 25th European Signal Processing Conference (EUSIPCO), 2017: 1185-1189.