Probabilistic Tensor Train Decomposition

Cited by: 4
Authors
Hinrich, Jesper L. [1 ]
Morup, Morten [1 ]
Affiliations
[1] Tech Univ Denmark, Dept Appl Math & Comp Sci, Lyngby, Denmark
Keywords
Bayesian inference; tensor train decomposition; matrix product state; multi-modal data;
DOI
10.23919/eusipco.2019.8903177
Chinese Library Classification (CLC)
TM [Electrical engineering]; TN [Electronics and communication technology];
Discipline codes
0808; 0809
Abstract
The tensor train decomposition (TTD) has become an attractive decomposition approach: it is easy to infer using the singular value decomposition, and its flexible yet compact representation enables efficient computation and reduced memory usage in subsequent analyses. Unfortunately, it is unclear how complex a model the TTD should use and in which order the modes should be decomposed. We advance the TTD to a fully probabilistic TTD (PTTD), using variational Bayesian inference to account for parameter uncertainty and noise. In particular, we exploit that the PTTD enables model comparison via the evidence lower bound (ELBO) of the variational approximation. On synthetic data with known ground-truth structure and on a real 3-way fluorescence spectroscopy dataset, we demonstrate that the ELBO quantifies model specification not only in terms of the number of components in each TTD factor, but also in terms of a suitable order of the modes in which the TTD should be applied. The proposed PTTD provides a principled framework for characterizing model uncertainty, complexity, and model and mode order when compressing tensor data using the TTD.
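The abstract's starting point, TTD inference "by use of the singular value decomposition", refers to the classical TT-SVD algorithm: sweep over the modes, matricize, and truncate an SVD at each step. A minimal NumPy sketch of that baseline (not the paper's probabilistic PTTD; the function names `tt_svd` and `tt_reconstruct` are illustrative, not from the paper):

```python
import numpy as np

def tt_svd(tensor, ranks):
    """Decompose a d-way tensor into TT cores via sequential truncated SVDs.

    ranks: the d-1 internal TT ranks; the boundary ranks are fixed to 1.
    Each core k has shape (r_{k-1}, n_k, r_k).
    """
    dims = tensor.shape
    d = len(dims)
    cores = []
    r_prev = 1
    # Matricize: first mode (times left rank) vs. all remaining modes.
    mat = tensor.reshape(r_prev * dims[0], -1)
    for k in range(d - 1):
        U, S, Vt = np.linalg.svd(mat, full_matrices=False)
        r = min(ranks[k], S.size)       # truncate to the requested TT rank
        cores.append(U[:, :r].reshape(r_prev, dims[k], r))
        # Push the singular values right and re-matricize for the next mode.
        mat = (S[:r, None] * Vt[:r]).reshape(r * dims[k + 1], -1)
        r_prev = r
    cores.append(mat.reshape(r_prev, dims[-1], 1))
    return cores

def tt_reconstruct(cores):
    """Contract TT cores back into the full tensor."""
    res = cores[0]
    for core in cores[1:]:
        res = np.tensordot(res, core, axes=1)  # contract shared TT rank
    return res.reshape([c.shape[1] for c in cores])
```

The mode-order ambiguity the abstract raises is visible here: `tt_svd` fixes one ordering of the modes, and permuting the tensor's axes before decomposition generally changes the achievable ranks and compression; the paper's PTTD uses the ELBO to compare such choices.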
Pages: 5
Related papers
50 records total
  • [21] Distributed Non-Negative Tensor Train Decomposition
    Bhattarai, Manish
    Chennupati, Gopinath
    Skau, Erik
    Vangara, Raviteja
    Djidjev, Hristo
    Alexandrov, Boian S.
    2020 IEEE HIGH PERFORMANCE EXTREME COMPUTING CONFERENCE (HPEC), 2020,
  • [22] Distributed and Randomized Tensor Train Decomposition for Feature Extraction
    Fonal, Krzysztof
    Zdunek, Rafal
    2019 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2019,
  • [23] Block Tensor Train Decomposition for Missing Value Imputation
    Lee, Namgil
    2018 ASIA-PACIFIC SIGNAL AND INFORMATION PROCESSING ASSOCIATION ANNUAL SUMMIT AND CONFERENCE (APSIPA ASC), 2018, : 1338 - 1343
  • [25] Block tensor train decomposition for missing data estimation
    Lee, Namgil
    Kim, Jong-Min
    STATISTICAL PAPERS, 2018, 59 (04) : 1283 - 1305
  • [26] Active Fault Detection Based on Tensor Train Decomposition
    Puncochar, Ivo
    Straka, Ondrej
    Tichavsky, Petr
    IFAC PAPERSONLINE, 2024, 58 (04): : 676 - 681
  • [27] PARALLEL ALGORITHMS FOR COMPUTING THE TENSOR-TRAIN DECOMPOSITION
    Shi, Tianyi
    Ruth, Maximilian
    Townsend, Alex
    SIAM JOURNAL ON SCIENTIFIC COMPUTING, 2023, 45 (03): : C101 - C130
  • [28] Nimble GNN Embedding with Tensor-Train Decomposition
    Yin, Chunxing
    Zheng, Da
    Nisa, Israt
    Faloutsos, Christos
    Karypis, George
    Vuduc, Richard
    PROCEEDINGS OF THE 28TH ACM SIGKDD CONFERENCE ON KNOWLEDGE DISCOVERY AND DATA MINING, KDD 2022, 2022, : 2327 - 2335
  • [29] Probabilistic Tensor Decomposition of Neural Population Spiking Activity
    Soulat, Hugo
    Keshavarzi, Sepiedeh
    Margrie, Troy W.
    Sahani, Maneesh
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [30] Probabilistic Tensor Canonical Polyadic Decomposition With Orthogonal Factors
    Cheng, Lei
    Wu, Yik-Chung
    Poor, H. Vincent
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2017, 65 (03) : 663 - 676