Multi-Dictionary Tensor Decomposition

Cited by: 0
Authors
McNeil, Maxwell [1]
Bogdanov, Petko [1]
Affiliation
[1] SUNY Albany, Comp Sci, Albany, NY 12222 USA
DOI
10.1109/ICDM58522.2023.00151
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104; 0812; 0835; 1405;
Abstract
Tensor decomposition methods are popular tools for the analysis of multi-way datasets from social media, healthcare, spatio-temporal, and other domains. Widely adopted models such as Tucker and canonical polyadic decomposition (CPD) follow a data-driven philosophy: they decompose a tensor into factors that approximate the observed data well. In some cases side information is available about the tensor modes. For example, in a temporal user-item purchases tensor, a user influence graph, an item similarity graph, and knowledge about seasonality or trends in the temporal mode may be available. Such side information may enable more succinct and interpretable tensor decomposition models and improved quality in downstream tasks. We propose a framework for Multi-Dictionary Tensor Decomposition (MDTD) which takes advantage of prior structural information about tensor modes in the form of coding dictionaries to obtain sparsely coded tensor factors. We derive a general optimization algorithm for MDTD that handles both complete inputs and inputs with missing values. MDTD handles large sparse tensors typical of many real-world application domains. We experimentally demonstrate its utility on both synthetic and real-world datasets. It learns more concise models than dictionary-free counterparts and improves (i) reconstruction quality (up to 60% smaller models coupled with reduced representation error); (ii) missing-value imputation quality (two-fold MSE reduction with up to orders-of-magnitude time savings); and (iii) the estimation of the tensor rank. MDTD's quality improvements do not come with a running-time premium: it can decompose 19GB datasets in less than a minute. It can also impute missing values in sparse billion-entry tensors more accurately and scalably than state-of-the-art competitors.
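The abstract outlines the core modeling idea: per-mode coding dictionaries (e.g., graph or temporal bases) constrain the decomposition factors to be sparse combinations of dictionary atoms. As a reading aid only, the sketch below illustrates that idea for a CPD-style model under the common dictionary-coding assumption U_n = Phi_n Y_n with sparse Y_n; all sizes, dictionary contents, and names (Phi, Y, R) are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (assumption, not the authors' MDTD code): dictionary-coded
# CPD factors. Each mode-n factor U_n is Phi_n @ Y_n, where Phi_n encodes
# side information for that mode and Y_n is a sparse coding matrix.
import numpy as np

rng = np.random.default_rng(0)

# Toy tensor dimensions and decomposition rank (illustrative values).
I, J, K, R = 30, 20, 10, 3

# Assumed per-mode dictionaries (random here; in practice graph bases,
# item-similarity bases, seasonal/trend bases, etc.).
Phi = [rng.standard_normal((I, 40)),
       rng.standard_normal((J, 25)),
       rng.standard_normal((K, 15))]

# Sparse coding matrices: each factor uses only a few dictionary atoms.
Y = [np.where(rng.random((P.shape[1], R)) < 0.2,
              rng.standard_normal((P.shape[1], R)), 0.0)
     for P in Phi]

# Dictionary-coded factors U_n = Phi_n @ Y_n.
U = [P @ Yn for P, Yn in zip(Phi, Y)]

# CPD-style reconstruction: X[i,j,k] = sum_r U1[i,r] * U2[j,r] * U3[k,r].
X = np.einsum('ir,jr,kr->ijk', U[0], U[1], U[2])

print(X.shape)                                   # (30, 20, 10)
print([int(np.count_nonzero(Yn)) for Yn in Y])   # sparsity of the codes
```

In MDTD itself, the abstract indicates that the sparse codes (and hence the factors) are learned by a general optimization algorithm that also handles missing entries; here they are drawn at random purely to show how the dictionaries, sparse codes, and reconstructed tensor fit together.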
Pages: 1217 - 1222
Number of Pages: 6
Related Papers
50 records in total
  • [41] Tensor Decomposition with Smoothness
    Imaizumi, Masaaki
    Hayashi, Kohei
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 70, 2017
  • [42] A new tensor decomposition
    I. V. Oseledets
    Doklady Mathematics, 2009, 80 : 495 - 496
  • [43] A new tensor decomposition
    Oseledets, I. V.
    DOKLADY MATHEMATICS, 2009, 80 (01) : 495 - 496
  • [44] Symmetric tensor decomposition
    Brachat, Jerome
    Comon, Pierre
    Mourrain, Bernard
    Tsigaridas, Elias
    LINEAR ALGEBRA AND ITS APPLICATIONS, 2010, 433 (11-12) : 1851 - 1872
  • [45] TENSOR DECOMPOSITION VIA CORE TENSOR NETWORKS
    Zhang, Jianfu
    Tao, Zerui
    Zhang, Liqing
    Zhao, Qibin
    2021 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP 2021), 2021, : 2130 - 2134
  • [46] Tensor gauge condition and tensor field decomposition
    Zhu, Ben-Chao
    Chen, Xiang-Song
    MODERN PHYSICS LETTERS A, 2015, 30 (35)
  • [47] Material Discrimination by Multi-Spectral CT Based on Image Total Variation and Tensor Dictionary
    Chen Peijun
    Feng Peng
    Wu Weiwen
    Wu Xiaochuan
    Fu Xiang
    Wei Biao
    He Peng
    ACTA OPTICA SINICA, 2018, 38 (11)
  • [48] Multi-view collective tensor decomposition for cross-modal hashing
    Cui, Limeng
    Zhang, Jiawei
    He, Lifang
    Yu, Philip S.
    INTERNATIONAL JOURNAL OF MULTIMEDIA INFORMATION RETRIEVAL, 2019, 8 (01) : 47 - 59
  • [49] Multi-modal discrete tensor decomposition hashing for efficient multimedia retrieval
    Wu, Xize
    Zhu, Lei
    Xie, Liang
    Zhang, Zheng
    Zhang, Huaxiang
    NEUROCOMPUTING, 2021, 465 : 1 - 14
  • [50] Consistent Population Synthesis With Multi-Social Relationships Based on Tensor Decomposition
    Ye, Peijun
    Zhu, Fenghua
    Sabri, Samer
    Wang, Fei-Yue
    IEEE TRANSACTIONS ON INTELLIGENT TRANSPORTATION SYSTEMS, 2020, 21 (05) : 2180 - 2189