GTT: Guiding the Tensor Train Decomposition

Cited by: 3
Authors
Li, Mao-Lin [1 ]
Candan, K. Selcuk [1 ]
Sapino, Maria Luisa [2 ]
Affiliations
[1] Arizona State Univ, Tempe, AZ 85281 USA
[2] Univ Turin, Turin, Italy
Funding
U.S. National Science Foundation
Keywords
Low-rank embedding; Tensor train decomposition;
DOI
10.1007/978-3-030-60936-8_15
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
The demand for searching and querying multimedia data, such as images, video, and audio, is omnipresent, and effectively accessing such data for various applications is a critical task. However, these data are usually encoded as multi-dimensional arrays, or tensors, and traditional data mining techniques may be limited by the curse of dimensionality. Tensor decomposition has been proposed to alleviate this issue; commonly used tensor decomposition algorithms include CP decomposition (which seeks a diagonal core) and Tucker decomposition (which seeks a dense core). Naturally, Tucker maintains more information, but, due to the denseness of its core, it is also subject to exponential memory growth with the number of tensor modes. Tensor train (TT) decomposition addresses this problem by seeking a sequence of three-mode cores; unfortunately, there are currently no guidelines for selecting the decomposition sequence. In this paper, we propose GTT, a method for guiding the tensor train decomposition in selecting its decomposition sequence. GTT leverages data characteristics (including the number of modes, the lengths of the individual modes, density, the distribution of mutual information, and the distribution of entropy) as well as the target decomposition rank to pick a decomposition order that preserves information. Experiments with various data sets demonstrate that GTT effectively guides the TT decomposition process towards decomposition sequences that better preserve accuracy.
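The abstract's central point is that the order in which modes are decomposed affects how much information a fixed-rank tensor train retains. The sketch below is a minimal NumPy illustration of that effect, not the authors' GTT implementation: it runs a standard TT-SVD with a capped rank on two different mode orderings of the same toy tensor and compares the reconstruction errors. The function names (tt_svd, tt_reconstruct), the rank cap max_rank, the example tensor, and the two candidate orders are illustrative assumptions.

import numpy as np

def tt_svd(tensor, max_rank):
    # Sequential truncated SVDs (TT-SVD); each SVD is capped at max_rank.
    shape = tensor.shape
    cores, r_prev, mat = [], 1, np.asarray(tensor)
    for n_k in shape[:-1]:
        mat = mat.reshape(r_prev * n_k, -1)
        u, s, vt = np.linalg.svd(mat, full_matrices=False)
        r_next = min(max_rank, len(s))
        cores.append(u[:, :r_next].reshape(r_prev, n_k, r_next))
        mat = s[:r_next, None] * vt[:r_next]   # remainder carried to the next mode
        r_prev = r_next
    cores.append(mat.reshape(r_prev, shape[-1], 1))
    return cores

def tt_reconstruct(cores):
    # Contract the chain of three-mode cores back into a full tensor.
    full = cores[0]
    for core in cores[1:]:
        full = np.tensordot(full, core, axes=1)
    return full.squeeze(axis=(0, -1))

rng = np.random.default_rng(0)
x = rng.random((8, 4, 16, 2))                   # toy 4-mode tensor (illustrative)
for order in [(0, 1, 2, 3), (2, 0, 3, 1)]:      # two candidate decomposition orders
    xp = np.transpose(x, order)
    approx = tt_reconstruct(tt_svd(xp, max_rank=3))
    err = np.linalg.norm(xp - approx) / np.linalg.norm(xp)
    print(order, "relative error = %.3f" % err)

On larger real tensors with skewed mode lengths or structured dependencies, the gap between orderings is typically much more pronounced; choosing a good order from data characteristics such as mode lengths, density, entropy, and mutual information is the problem GTT addresses.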
Pages: 187-202
Page count: 16
Related papers (50 in total)
  • [1] GTT: Leveraging data characteristics for guiding the tensor train decomposition
    Li, Mao-Lin
    Candan, K. Selcuk
    Sapino, Maria Luisa
    INFORMATION SYSTEMS, 2022, 108
  • [2] Probabilistic Tensor Train Decomposition
    Hinrich, Jesper L.
    Morup, Morten
    2019 27TH EUROPEAN SIGNAL PROCESSING CONFERENCE (EUSIPCO), 2019,
  • [3] TENSOR-TRAIN DECOMPOSITION
    Oseledets, I. V.
    SIAM JOURNAL ON SCIENTIFIC COMPUTING, 2011, 33 (05): : 2295 - 2317
  • [4] SPECTRAL TENSOR-TRAIN DECOMPOSITION
    Bigoni, Daniele
    Engsig-Karup, Allan P.
    Marzouk, Youssef M.
    SIAM JOURNAL ON SCIENTIFIC COMPUTING, 2016, 38 (04): : A2405 - A2439
  • [5] GRAPH REGULARIZED TENSOR TRAIN DECOMPOSITION
    Sofuoglu, Seyyid Emre
    Aviyente, Selin
    2020 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING, 2020, : 3912 - 3916
  • [6] AN INCREMENTAL TENSOR TRAIN DECOMPOSITION ALGORITHM
    Aksoy, Doruk
    Gorsich, David J.
    Veerapaneni, Shravan
    Gorodetsky, Alex A.
    SIAM JOURNAL ON SCIENTIFIC COMPUTING, 2024, 46 (02): : A1047 - A1075
  • [7] Faster tensor train decomposition for sparse data
    Li, Lingjie
    Yu, Wenjian
    Batselier, Kim
    JOURNAL OF COMPUTATIONAL AND APPLIED MATHEMATICS, 2022, 405
  • [8] Tensor-Train decomposition for image recognition
    Brandoni, D.
    Simoncini, V
    CALCOLO, 2020, 57 (01)
  • [9] Optimizing the Order of Modes in Tensor Train Decomposition
    Tichavsky, Petr
    Straka, Ondrej
    IEEE SIGNAL PROCESSING LETTERS, 2025, 32 : 1361 - 1365