Static and Streaming Tucker Decomposition for Dense Tensors

Cited by: 3
Authors
Jang, Jun-Gi [1 ]
Kang, U. [1 ]
Affiliations
[1] Seoul Natl Univ, Comp Sci & Engn, 1 Gwanak Ro, Seoul 08826, South Korea
Funding
National Research Foundation, Singapore;
Keywords
Dense tensor; Tucker decomposition; static setting; online streaming setting; efficiency; APPROXIMATION; ALGORITHM;
DOI
10.1145/3568682
CLC number
TP [Automation & Computer Technology];
Subject classification code
0812;
Abstract
Given a dense tensor, how can we efficiently discover hidden relations and patterns in static and online streaming settings? Tucker decomposition is a fundamental tool for analyzing multidimensional arrays in the form of tensors. However, existing Tucker decomposition methods in both static and online streaming settings are inefficient because they operate directly on large dense tensors to compute the decomposition. In the static setting, although a few methods have tried to reduce their time cost by sampling tensors, sketching tensors, and using efficient matrix operations, there remains a need for an efficient method. Moreover, streaming versions of Tucker decomposition are still time-consuming when dealing with newly arrived tensors. We propose D-Tucker and D-TuckerO, efficient Tucker decomposition methods for large dense tensors in the static and online streaming settings, respectively. By decomposing a given large dense tensor with randomized singular value decomposition (SVD), avoiding reconstruction from the SVD results, and carefully determining the order of operations, D-Tucker and D-TuckerO efficiently obtain the factor matrices and core tensor. Experimental results show that D-Tucker achieves up to 38.4x faster running times and requires up to 17.2x less space than existing methods while providing similar accuracy. Furthermore, D-TuckerO is up to 6.1x faster than existing streaming methods for each newly arrived tensor, and its running time is proportional to the size of the newly arrived tensor, not the accumulated tensor.
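To make the idea in the abstract concrete, the following is a minimal NumPy sketch of the general technique the paper builds on: computing a Tucker decomposition (factor matrix per mode plus a core tensor) where each mode's factor comes from a randomized SVD of the corresponding tensor unfolding, in the style of a truncated HOSVD. This is an illustrative sketch of randomized-SVD-based Tucker, not the authors' D-Tucker or D-TuckerO algorithms; the function names and oversampling parameter are our own assumptions.

```python
import numpy as np

def randomized_svd(A, rank, oversample=10, seed=0):
    # Halko-style randomized SVD (illustrative, not the paper's variant):
    # project A onto a random subspace, then run an exact SVD on the small matrix.
    rng = np.random.default_rng(seed)
    Omega = rng.standard_normal((A.shape[1], rank + oversample))
    Q, _ = np.linalg.qr(A @ Omega)            # orthonormal basis capturing range(A)
    Uh, s, Vt = np.linalg.svd(Q.T @ A, full_matrices=False)
    return (Q @ Uh)[:, :rank], s[:rank], Vt[:rank]

def tucker_hosvd(X, ranks, seed=0):
    # Truncated HOSVD sketch: one factor matrix per mode from the randomized SVD
    # of that mode's unfolding; the core tensor is X contracted with every
    # factor transpose. D-Tucker refines this scheme with careful operation ordering.
    factors = []
    for mode, r in enumerate(ranks):
        unfold = np.moveaxis(X, mode, 0).reshape(X.shape[mode], -1)
        U, _, _ = randomized_svd(unfold, r, seed=seed + mode)
        factors.append(U)
    core = X
    for mode, U in enumerate(factors):
        # mode-n product of the current core with U.T
        core = np.moveaxis(np.tensordot(U.T, np.moveaxis(core, mode, 0), axes=1), 0, mode)
    return core, factors
```

Reconstructing the tensor multiplies the core by each factor matrix along its mode; for an exactly low-multilinear-rank tensor the sketch recovers it, while for general dense tensors it gives the compressed approximation that makes Tucker-based analysis efficient.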
Pages: 34