Probabilistic Tensor Train Decomposition

Cited by: 4
Authors
Hinrich, Jesper L. [1 ]
Morup, Morten [1 ]
Affiliations
[1] Tech Univ Denmark, Dept Appl Math & Comp Sci, Lyngby, Denmark
Keywords
Bayesian inference; tensor train decomposition; matrix product state; multi-modal data;
DOI
10.23919/eusipco.2019.8903177
Chinese Library Classification
TM (Electrical Engineering); TN (Electronics and Communication Technology)
Discipline Classification Codes
0808; 0809
Abstract
The tensor train decomposition (TTD) has become an attractive decomposition approach due to its ease of inference via the singular value decomposition and its flexible yet compact representation, which enables efficient computation and reduced memory usage in further analyses. Unfortunately, it is unclear what level of model complexity to use and in which order the modes should be decomposed when applying the TTD. We advance the TTD to a fully probabilistic TTD (PTTD) using variational Bayesian inference to account for parameter uncertainty and noise. In particular, we exploit that the PTTD enables model comparison by use of the evidence lower bound (ELBO) of the variational approximation. On synthetic data with known ground-truth structure and a real three-way fluorescence spectroscopy dataset, we demonstrate how the ELBO admits quantification of model specification not only in terms of the number of components for each factor in the TTD, but also a suitable ordering of the modes in which the TTD should be employed. The proposed PTTD provides a principled framework for characterizing model uncertainty, model complexity, and mode order when compressing tensor data using the TTD.
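For context, the classical (non-probabilistic) TTD that the abstract builds on is computed by a sequence of truncated SVDs on reshaped unfoldings, one per mode (the TT-SVD scheme). Below is a minimal NumPy sketch of that baseline; the function names `tt_svd` and `tt_to_full` and the fixed-rank truncation are illustrative assumptions, and this is not the paper's variational PTTD inference, which instead places distributions on the cores and scores models by the ELBO.

```python
import numpy as np

def tt_svd(x, ranks):
    """Decompose a d-way tensor into d TT cores via sequential truncated SVDs.

    `ranks` lists the d-1 internal TT ranks; boundary ranks are fixed to 1,
    so core k has shape (r_{k-1}, n_k, r_k).
    """
    dims = x.shape
    d = len(dims)
    cores = []
    r_prev = 1
    mat = x.reshape(dims[0], -1)
    for k in range(d - 1):
        # Unfold: rows combine the previous rank with the current mode.
        mat = mat.reshape(r_prev * dims[k], -1)
        u, s, vt = np.linalg.svd(mat, full_matrices=False)
        r = min(ranks[k], len(s))          # truncate to the requested TT rank
        cores.append(u[:, :r].reshape(r_prev, dims[k], r))
        mat = s[:r, None] * vt[:r]         # carry the remainder to the next mode
        r_prev = r
    cores.append(mat.reshape(r_prev, dims[-1], 1))
    return cores

def tt_to_full(cores):
    """Contract TT cores back into the full tensor (for checking the fit)."""
    res = cores[0]
    for core in cores[1:]:
        res = np.tensordot(res, core, axes=([-1], [0]))
    return res.reshape([c.shape[1] for c in cores])
```

With the internal ranks set to the full unfolding ranks the reconstruction is exact; choosing smaller ranks trades reconstruction error for compression, and it is exactly this rank/mode-order choice that the paper's ELBO-based comparison is meant to guide.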
Pages: 5