Algorithms for sparse nonnegative Tucker decompositions

Cited by: 111
Authors
Morup, Morten [1]
Hansen, Lars Kai [1]
Arnfred, Sidse M. [2]
Affiliations
[1] Tech Univ Denmark, DK-2800 Kongens Lyngby, Denmark
[2] Univ Copenhagen Hosp, Hvidovre Hosp, Dept Psychiat, Copenhagen, Denmark
Keywords
DOI
10.1162/neco.2008.11-06-407
Chinese Library Classification (CLC):
TP18 [Artificial intelligence theory];
Discipline Classification Codes:
081104; 0812; 0835; 1405
Abstract
There is an increasing interest in the analysis of large-scale multiway data. The concept of multiway data refers to arrays of data with more than two dimensions, that is, taking the form of tensors. To analyze such data, decomposition techniques are widely used. The two most common decompositions for tensors are the Tucker model and the more restricted PARAFAC model. Both models can be viewed as generalizations of regular factor analysis to data of more than two modalities. Nonnegative matrix factorization (NMF), in conjunction with sparse coding, has recently received much attention due to its part-based and easily interpretable representation. While NMF has been extended to the PARAFAC model, no such attempt has been made to extend NMF to the Tucker model. However, if the tensor data analyzed are nonnegative, it may well be relevant to consider purely additive (i.e., nonnegative) Tucker decompositions. To reduce the ambiguities of this type of decomposition, we develop updates that can impose sparseness in any combination of modalities, and hence propose algorithms for sparse nonnegative Tucker decompositions (SN-TUCKER). We demonstrate how the proposed algorithms are superior to existing algorithms for Tucker decompositions when the data and interactions can be considered nonnegative. We further illustrate how sparse coding can help identify which model (PARAFAC or Tucker) is more appropriate for the data, as well as select the number of components by turning off excess components. The algorithms for SN-TUCKER can be downloaded from Morup (2007).
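To make the decomposition concrete, the minimal sketch below fits a 3-way nonnegative Tucker model X ~ G x_1 A1 x_2 A2 x_3 A3 using standard NMF-style multiplicative updates with an optional L1 penalty that encourages sparseness in chosen modalities. This is an illustration under those assumptions, not the authors' SN-TUCKER toolbox; the names sn_tucker, lambda_factors, and lambda_core are hypothetical.

import numpy as np

# Sketch of a sparse nonnegative Tucker decomposition for a 3-way tensor.
# Assumption: least-squares fit with L1 (sparsity) penalties, solved by
# multiplicative updates; illustrative only, not the Morup (2007) toolbox.

def unfold(T, mode):
    # Mode-n unfolding: move the chosen axis to the front and flatten the rest.
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def fold(M, mode, shape):
    # Inverse of unfold for a tensor of the given shape.
    full = [shape[mode]] + [s for i, s in enumerate(shape) if i != mode]
    return np.moveaxis(M.reshape(full), 0, mode)

def mode_dot(T, M, mode):
    # Mode-n product of tensor T with matrix M.
    new_shape = T.shape[:mode] + (M.shape[0],) + T.shape[mode + 1:]
    return fold(M @ unfold(T, mode), mode, new_shape)

def sn_tucker(X, ranks, lambda_factors=0.0, lambda_core=0.0,
              n_iter=200, eps=1e-9, seed=0):
    # X ~ G x_1 A[0] x_2 A[1] x_3 A[2], with all entries nonnegative.
    rng = np.random.default_rng(seed)
    A = [rng.random((X.shape[n], ranks[n])) for n in range(3)]
    G = rng.random(ranks)
    for _ in range(n_iter):
        for n in range(3):
            # Z_n: core multiplied by every factor except the n-th, unfolded along mode n.
            T = G
            for m in range(3):
                if m != n:
                    T = mode_dot(T, A[m], m)
            Zn = unfold(T, n)
            Xn = unfold(X, n)
            # Multiplicative update; the L1 penalty enters the denominator.
            A[n] *= (Xn @ Zn.T) / (A[n] @ (Zn @ Zn.T) + lambda_factors + eps)
        # Core update: numerator X x_n A[n]^T, denominator G x_n (A[n]^T A[n]).
        num, den = X, G
        for n in range(3):
            num = mode_dot(num, A[n].T, n)
            den = mode_dot(den, A[n].T @ A[n], n)
        G *= num / (den + lambda_core + eps)
    return G, A

# Example: fit a small synthetic nonnegative tensor with sparse mode loadings.
X = np.random.default_rng(1).random((10, 12, 8))
G, A = sn_tucker(X, ranks=(3, 3, 2), lambda_factors=0.1)

In this sketch, lambda_factors > 0 drives the mode loadings toward sparse, part-based representations, while lambda_core > 0 instead sparsifies the core array, which is the mechanism by which excess interactions, and hence excess components, can be turned off.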
Pages: 2112-2131
Number of pages: 20
Related Papers
50 records in total
  • [1] Efficient Nonnegative Tucker Decompositions: Algorithms and Uniqueness
    Zhou, Guoxu
    Cichocki, Andrzej
    Zhao, Qibin
    Xie, Shengli
    [J]. IEEE TRANSACTIONS ON IMAGE PROCESSING, 2015, 24 (12) : 4990 - 5003
  • [2] Fast and Efficient Algorithms for Nonnegative Tucker Decomposition
    Phan, Anh Huy
    Cichocki, Andrzej
    [J]. ADVANCES IN NEURAL NETWORKS - ISNN 2008, PT 2, PROCEEDINGS, 2008, 5264 : 772 - 782
  • [3] Alternating proximal gradient method for sparse nonnegative Tucker decomposition
    Xu, Yangyang
    [J]. MATHEMATICAL PROGRAMMING COMPUTATION, 2015, 7 (01) : 39 - 70
  • [4] Randomized algorithms for the approximations of Tucker and the tensor train decompositions
    Che, Maolin
    Wei, Yimin
    [J]. ADVANCES IN COMPUTATIONAL MATHEMATICS, 2019, 45 (01) : 395 - 428
  • [5] Multifactor sparse feature extraction using Convolutive Nonnegative Tucker Decomposition
    Wu, Qiang
    Zhang, Liqing
    Cichocki, Andrzej
    [J]. NEUROCOMPUTING, 2014, 129 : 17 - 24
  • [6] Tensor Regression Using Low-Rank and Sparse Tucker Decompositions
    Ahmed, Talal
    Raja, Haroon
    Bajwa, Waheed U.
    [J]. SIAM JOURNAL ON MATHEMATICS OF DATA SCIENCE, 2020, 2 (04): : 944 - 966
  • [7] Scalable Tucker Factorization for Sparse Tensors - Algorithms and Discoveries
    Oh, Sejoon
    Park, Namyong
    Sael, Lee
    Kang, U.
    [J]. 2018 IEEE 34TH INTERNATIONAL CONFERENCE ON DATA ENGINEERING (ICDE), 2018, : 1120 - 1131
  • [8] Randomized Algorithms for Low-Rank Tensor Decompositions in the Tucker Format
    Minster, Rachel
    Saibaba, Arvind K.
    Kilmer, Misha E.
    [J]. SIAM JOURNAL ON MATHEMATICS OF DATA SCIENCE, 2020, 2 (01): : 189 - 215
  • [9] Nonnegative Tucker Decomposition
    Kim, Yong-Deok
    Choi, Seungjin
    [J]. 2007 IEEE CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION, VOLS 1-8, 2007, : 3104 - +