CANDECOMP/PARAFAC Decomposition of High-Order Tensors Through Tensor Reshaping

Cited by: 34
Authors
Phan, Anh-Huy [1 ]
Tichavsky, Petr [2 ]
Cichocki, Andrzej [1 ,3 ]
Affiliations
[1] RIKEN, Lab Adv Brain Signal Proc, Brain Sci Inst, Wako, Saitama 3510198, Japan
[2] Acad Sci Czech Republ, Inst Informat Theory & Automat, CR-18208 Prague, Czech Republic
[3] Polish Acad Sci, Syst Res Inst, PL-01447 Warsaw, Poland
Keywords
Tensor factorization; canonical decomposition; PARAFAC; ALS; structured CPD; tensor unfolding; Cramer-Rao induced bound (CRIB); Cramer-Rao lower bound (CRLB); UNDERDETERMINED MIXTURES; BLIND IDENTIFICATION; POLYADIC DECOMPOSITION; LEAST-SQUARES; UNIQUENESS; ALGORITHMS; PARAFAC; RANK; APPROXIMATION; COMPLEXITY;
DOI
10.1109/TSP.2013.2269046
Chinese Library Classification (CLC)
TM [Electrical Engineering]; TN [Electronic Technology, Communication Technology]
Discipline classification codes
0808; 0809
Abstract
In general, algorithms for order-3 CANDECOMP/PARAFAC (CP), also known as canonical polyadic decomposition (CPD), are easy to implement and can be extended to higher-order CPD. Unfortunately, such algorithms become computationally demanding and are often not applicable to higher-order and relatively large-scale tensors. In this paper, by exploiting the uniqueness of CPD and the relation between a tensor in Kruskal form and its unfolded tensor, we propose a fast approach to deal with this problem. Instead of directly factorizing the high-order data tensor, the method decomposes an unfolded tensor of lower order, e.g., an order-3 tensor. On the basis of the estimated order-3 tensor, a structured Kruskal tensor of the same dimensions as the data tensor is then generated and decomposed to find the final solution, using fast algorithms for the structured CPD. In addition, strategies to unfold tensors are suggested and verified in practice.
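To make the reshaping idea concrete, the sketch below illustrates it in Python, assuming TensorLy's parafac routine for the order-3 decomposition. The tensor sizes, the mode grouping, and the helper split_merged_factor are illustrative choices, and the per-column rank-1 SVD used to split the merged-mode factors merely stands in for the structured-CPD refinement stage developed in the paper.

# A minimal sketch (not the authors' exact algorithm) of CPD through tensor
# reshaping: an order-5 tensor is reshaped to order 3 by grouping modes,
# the order-3 tensor is decomposed by ALS, and the factors of the grouped
# modes are recovered from the Khatri-Rao structure by per-column rank-1 SVDs.
# Assumes TensorLy is available; variable names are illustrative only.
import numpy as np
import tensorly as tl
from tensorly.decomposition import parafac

rng = np.random.default_rng(0)
dims, rank = (6, 7, 8, 9, 10), 4

# Synthetic order-5 tensor of known CP rank.
A = [rng.standard_normal((d, rank)) for d in dims]
X = np.einsum('ir,jr,kr,lr,mr->ijklm', *A)

# Reshape (unfold) to order 3 by merging the first two and the next two modes.
Y = X.reshape(dims[0] * dims[1], dims[2] * dims[3], dims[4])

# Order-3 CPD of the reshaped tensor via ALS.
weights, F = parafac(tl.tensor(Y), rank=rank, n_iter_max=500, tol=1e-12)

def split_merged_factor(Fm, shape):
    """Each column of a merged-mode factor is (approximately) a vectorized
    rank-1 matrix a_r b_r^T; split it back with a rank-1 SVD per column."""
    U = np.empty((shape[0], Fm.shape[1]))
    V = np.empty((shape[1], Fm.shape[1]))
    for r in range(Fm.shape[1]):
        M = np.reshape(Fm[:, r], shape)          # column -> d_i x d_j matrix
        u, s, vt = np.linalg.svd(M, full_matrices=False)
        U[:, r] = u[:, 0] * s[0]
        V[:, r] = vt[0, :]
    return U, V

A1, A2 = split_merged_factor(F[0], (dims[0], dims[1]))
A3, A4 = split_merged_factor(F[1], (dims[2], dims[3]))
A5 = F[2] * weights                              # absorb the CP scaling

# Error of the recovered order-5 Kruskal tensor; the paper refines such an
# initial estimate with a structured-CPD stage rather than stopping here.
X_hat = np.einsum('ir,jr,kr,lr,mr->ijklm', A1, A2, A3, A4, A5)
print('relative error:', np.linalg.norm(X_hat - X) / np.linalg.norm(X))

The per-column SVD split works because, under the uniqueness conditions exploited in the paper, the CPD of the unfolded tensor inherits the Khatri-Rao structure of the original Kruskal tensor, so each merged-mode factor column is a vectorized rank-1 matrix whose factors match the original ones up to scaling and permutation.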
Pages: 4847-4860
Number of pages: 14
Related Papers
50 records
  • [41] Jiang, Bo; Wang, Haoyue; Zhang, Shuzhong. An Optimal High-Order Tensor Method for Convex Optimization. Conference on Learning Theory, Vol. 99, 2019.
  • [42] Jin, Diyi; Liu, Jianjun; Yang, Jinlong; Wu, Zebin. High-Order Coupled Fully Connected Tensor Network Decomposition for Hyperspectral Image Super-Resolution. IEEE Geoscience and Remote Sensing Letters, 2022, 19.
  • [43] Qiu, Yonghui; Weng, Zibin; Hou, Ding; Ma, Mingxu; Liu, Jianfeng. Bandwidth Enhancement of Metasurface Antennas by Shifting and Reshaping High-Order Mode. IEEE Antennas and Wireless Propagation Letters, 2023, 22(4): 933-937.
  • [44] Chen, Haibin; Wang, Yiju; Zhou, Guanglu. High-order sum-of-squares structured tensors: theory and applications. Frontiers of Mathematics in China, 2020, 15(2): 255-284.
  • [45] Qiu, Yuning; Zhou, Guoxu; Wang, Andong; Zhao, Qibin; Xie, Shengli. Balanced Unfolding Induced Tensor Nuclear Norms for High-Order Tensor Completion. IEEE Transactions on Neural Networks and Learning Systems, 2024: 1-14.
  • [46] Karmouda, Ouafae; Boulanger, Jeremie; Boyer, Remy. Speeding Up of Kernel-Based Learning for High-Order Tensors. 2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP 2021), 2021: 2905-2909.
  • [48] Zniyed, Yassine; Boyer, Remy; De Almeida, Andre L. F.; Favier, Gerard. A TT-Based Hierarchical Framework for Decomposing High-Order Tensors. SIAM Journal on Scientific Computing, 2020, 42(2): A822-A848.
  • [49] Wu, Leqin; Liu, Xin; Wen, Zaiwen. Symmetric rank-1 approximation of symmetric high-order tensors. Optimization Methods & Software, 2020, 35(2): 416-438.
  • [50] Wang, Yiju; Dong, Manman; Xu, Yi. A sparse rank-1 approximation algorithm for high-order tensors. Applied Mathematics Letters, 2020, 102.