Fast Nonnegative Tensor Factorization for Very Large-Scale Problems Using Two-Stage Procedure

Authors
Phan, Anh Huy [1 ]
Cichocki, Andrzej [1 ]
Affiliations
[1] RIKEN, Brain Sci Inst, Lab Adv Brain Signal Proc, Wako, Saitama 3510198, Japan
Abstract
Parallel factor analysis (PARAFAC) is a multi-way decomposition method that finds hidden factors in raw tensor data. Recently, nonnegative tensor factorization (NTF), a variant of the model that imposes nonnegativity constraints on the hidden factors, has attracted interest because it yields meaningful representations, with many potential applications in neuroscience, bioinformatics, chemometrics, etc. [1], [2]. NTF algorithms can be readily extended from algorithms for nonnegative matrix factorization (NMF) by forming learning rules on the unfolded tensor [1], [3]. However, such algorithms often compute Khatri-Rao products of the factors, which produce large matrices and require substantial memory for temporary variables. Decomposing large-scale tensors therefore remains a challenging problem for NTF. PARAFAC by alternating least squares (ALS) can explain the raw tensor by a small number of rank-one tensors with high fitness. Exploiting this advantage, we propose a new fast NTF algorithm that factorizes the approximate tensor obtained from PARAFAC. The new algorithm computes only Hadamard products, so it is extremely fast in comparison with existing NTF algorithms. Extensive experiments confirm the validity, high performance, and high speed of the developed algorithm.
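The two-stage procedure described in the abstract can be sketched as follows. This is a hedged illustration under stated assumptions, not the authors' implementation: stage 1 fits an unconstrained PARAFAC model by ALS; stage 2 fits nonnegative factors to the resulting low-rank approximation with multiplicative updates. Because the stage-1 approximation is already in factored form, every cross-term in stage 2 collapses to Hadamard products of small R × R Gram matrices, so the raw tensor is never unfolded again. Clipping negative numerator entries is a simplification assumed here, and all function names are illustrative.

```python
import numpy as np

def mttkrp(X, B, C):
    # Computes X_(1) (C ⊙ B) without materializing the Khatri-Rao product.
    return np.einsum('ijk,jr,kr->ir', X, B, C)

def als_parafac(X, R, n_iter=100, seed=0):
    """Stage 1: unconstrained PARAFAC of a 3-way tensor via ALS."""
    rng = np.random.default_rng(seed)
    A = rng.standard_normal((X.shape[0], R))
    B = rng.standard_normal((X.shape[1], R))
    C = rng.standard_normal((X.shape[2], R))
    for _ in range(n_iter):
        A = mttkrp(X, B, C) @ np.linalg.pinv((B.T @ B) * (C.T @ C))
        B = mttkrp(X.transpose(1, 0, 2), A, C) @ np.linalg.pinv((A.T @ A) * (C.T @ C))
        C = mttkrp(X.transpose(2, 0, 1), A, B) @ np.linalg.pinv((A.T @ A) * (B.T @ B))
    return A, B, C

def ntf_from_parafac(A, B, C, n_iter=500, eps=1e-12, seed=1):
    """Stage 2: nonnegative factors fitted to the PARAFAC approximation.
    Since Y_(1) = A (C ⊙ B)^T, the multiplicative-update numerator
    Y_(1)(Cn ⊙ Bn) equals A ((B^T Bn) * (C^T Cn)): only R x R
    Hadamard products, independent of the raw tensor size."""
    rng = np.random.default_rng(seed)
    An, Bn, Cn = rng.random(A.shape), rng.random(B.shape), rng.random(C.shape)
    for _ in range(n_iter):
        num = A @ ((B.T @ Bn) * (C.T @ Cn))            # may be negative: A, B, C are unconstrained
        den = An @ ((Bn.T @ Bn) * (Cn.T @ Cn)) + eps
        An *= np.clip(num, 0, None) / den              # clipping is an assumed simplification
        num = B @ ((A.T @ An) * (C.T @ Cn))
        den = Bn @ ((An.T @ An) * (Cn.T @ Cn)) + eps
        Bn *= np.clip(num, 0, None) / den
        num = C @ ((A.T @ An) * (B.T @ Bn))
        den = Cn @ ((An.T @ An) * (Bn.T @ Bn)) + eps
        Cn *= np.clip(num, 0, None) / den
    return An, Bn, Cn
```

Note the cost structure this sketch illustrates: stage 1 touches the raw tensor once per sweep, while each stage-2 sweep costs only O((I+J+K)R²), which is what makes the second stage fast for very large tensors.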
Pages: 297-300 (4 pages)