Low-rank tensor ring learning for multi-linear regression

Cited by: 8
Authors
Liu, Jiani [1 ]
Zhu, Ce [1 ]
Long, Zhen [1 ]
Huang, Huyan [1 ]
Liu, Yipeng [1 ]
Affiliation
[1] Univ Elect Sci & Technol China UESTC, Sch Informat & Commun Engn, Xiyuan Ave 2006, Chengdu 611731, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Multilinear regression; Ridge regression; Tensor ring decomposition; MONTE-CARLO ALGORITHMS; DIMENSIONALITY REDUCTION; APPROXIMATION; OPTIMIZATION; RECOGNITION; FRAMEWORK;
DOI
10.1016/j.patcog.2020.107753
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The emergence of large-scale data demands new regression models with multi-dimensional coefficient arrays, known as tensor regression models. The recently proposed tensor ring decomposition has interesting properties of enhanced representation and compression capability, cyclic permutation invariance, and balanced tensor ring rank, which may lead to efficient computation and fewer parameters in regression problems. In this paper, a general multi-linear tensor-on-tensor regression model is proposed in which the coefficient array has a low-rank tensor ring structure; the model is termed tensor ring ridge regression (TRRR). Two optimization models are developed for the TRRR problem and solved by different algorithms: the tensor-factorization-based one is solved by an alternating least squares algorithm, accelerated by a fast network contraction, while the rank-minimization-based one is addressed by the alternating direction method of multipliers. Comparative experiments, including spatio-temporal forecasting tasks and 3D reconstruction of human motion capture data from its temporally synchronized video sequences, demonstrate the enhanced performance of our algorithms over existing state-of-the-art ones, especially in terms of training time. (c) 2020 Elsevier Ltd. All rights reserved.
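The abstract's claims about compression rest on the tensor ring (TR) format: an order-d tensor is stored as d small cores G_k of shape (r_k, n_k, r_{k+1}), with the ring closed by a trace over the first rank index. The snippet below is a minimal numpy sketch of this contraction, not the paper's TRRR algorithm; the core shapes and toy ranks are illustrative assumptions.

```python
import numpy as np

def tr_to_full(cores):
    """Contract tensor ring cores G_k of shape (r_k, n_k, r_{k+1})
    into the full tensor; the ring is closed by a trace over the
    shared first/last rank index."""
    full = cores[0]                                  # (r1, n1, r2)
    for core in cores[1:]:
        # chain contraction: (..., r_k) x (r_k, n_k, r_{k+1})
        full = np.tensordot(full, core, axes=([-1], [0]))
    # trace over the leading and trailing rank indices closes the ring
    return np.trace(full, axis1=0, axis2=-1)

# toy example: a 4 x 5 x 6 tensor stored with TR ranks (2, 3, 2)
shapes = [(2, 4, 3), (3, 5, 2), (2, 6, 2)]
rng = np.random.default_rng(0)
cores = [rng.standard_normal(s) for s in shapes]

T = tr_to_full(cores)
n_params = sum(c.size for c in cores)
print(T.shape)                       # (4, 5, 6)
print(n_params, "core parameters vs", T.size, "full entries")
```

A low-rank TR coefficient array in a regression model is parameterized by the cores alone (here 78 numbers instead of 120), which is the source of the parameter savings and training-time gains the abstract reports.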
Pages: 12
Related papers
50 records
  • [1] Fast Recursive Low-rank Tensor Learning for Regression
    Hou, Ming
    Chaib-draa, Brahim
    [J]. PROCEEDINGS OF THE TWENTY-SIXTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2017, : 1851 - 1857
  • [2] On the difference between low-rank and subspace approximation: improved model for multi-linear PLS regression
    Bro, R
    Smilde, AK
    de Jong, S
    [J]. CHEMOMETRICS AND INTELLIGENT LABORATORY SYSTEMS, 2001, 58 (01) : 3 - 13
  • [3] Low-Rank Regression with Tensor Responses
    Rabusseau, Guillaume
    Kadri, Hachem
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 29 (NIPS 2016), 2016, 29
  • [4] LOW-RANK TENSOR HUBER REGRESSION
    Wei, Yangxin
    Luo, Ziyan
    Chen, Yang
    [J]. PACIFIC JOURNAL OF OPTIMIZATION, 2022, 18 (02): : 439 - 458
  • [5] Low Rank Variational Tensor Recovery for Multi-Linear Inverse Problems
    Alqadah, Hatim F.
    Fan, Howard
    [J]. 2011 CONFERENCE RECORD OF THE FORTY-FIFTH ASILOMAR CONFERENCE ON SIGNALS, SYSTEMS & COMPUTERS (ASILOMAR), 2011, : 1438 - 1442
  • [6] Low-Rank Tensor Thresholding Ridge Regression
    Guo, Kailing
    Zhang, Tong
    Xu, Xiangmin
    Xing, Xiaofen
    [J]. IEEE ACCESS, 2019, 7 : 153761 - 153772
  • [7] Low-rank Tensor Regression: Scalability and Applications
    Liu, Yan
    [J]. 2017 IEEE 7TH INTERNATIONAL WORKSHOP ON COMPUTATIONAL ADVANCES IN MULTI-SENSOR ADAPTIVE PROCESSING (CAMSAP), 2017,
  • [8] Boosted Sparse and Low-Rank Tensor Regression
    He, Lifang
    Chen, Kun
    Xu, Wanwan
    Zhou, Jiayu
    Wang, Fei
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 31 (NIPS 2018), 2018, 31
  • [9] Robust Low-Rank Tensor Ring Completion
    Huang, Huyan
    Liu, Yipeng
    Long, Zhen
    Zhu, Ce
    [J]. IEEE TRANSACTIONS ON COMPUTATIONAL IMAGING, 2020, 6 : 1117 - 1126
  • [10] TENSOR QUANTILE REGRESSION WITH LOW-RANK TENSOR TRAIN ESTIMATION
    Liu, Zihuan
    Lee, Cheuk Yin
    Zhang, Heping
    [J]. ANNALS OF APPLIED STATISTICS, 2024, 18 (02): : 1294 - 1318