Probabilistic Streaming Tensor Decomposition

Cited by: 28
Authors
Du, Yishuai [1 ]
Zheng, Yimin [1 ]
Lee, Kuang-Chih [2 ]
Zhe, Shandian [1 ]
Affiliations
[1] Univ Utah, Salt Lake City, UT 84112 USA
[2] Alibaba Grp, Hangzhou, Zhejiang, Peoples R China
Keywords
tensor data; streaming decomposition; posterior inference
DOI
10.1109/ICDM.2018.00025
CLC Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Tensor decomposition is a fundamental tool for multiway data analysis. Most decomposition algorithms, however, operate on a static collection of data and run as batch processes, while many applications produce data in a streaming manner: subsets of entries arrive over time, and previously seen entries cannot be revisited. In such scenarios, traditional decomposition approaches are ill-suited, because they cannot provide timely updates when new data arrive and must access the whole dataset many times for batch optimization. To address this issue, we propose POST, a PrObabilistic Streaming Tensor decomposition algorithm, which enables real-time updates and predictions upon receiving new tensor entries and supports dynamic growth of all the modes. Compared with the state-of-the-art streaming decomposition approach MAST [1], POST is more flexible in that it can handle arbitrary orders of streaming entries, and hence is more widely applicable. In addition, as a Bayesian inference algorithm, POST can quantify the uncertainty of the latent embeddings via their posterior distributions, as well as the confidence levels of predictions for missing entry values. On several real-world datasets, POST exhibits predictive performance better than or comparable to MAST and other static decomposition algorithms.
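The kind of update the abstract describes can be illustrated with a toy sketch: each factor row keeps a Gaussian posterior, and an arriving entry triggers a conjugate linear-Gaussian update of one row per mode, holding the other modes at their current means. This is only an illustration of streaming Bayesian CP-style updates, not the paper's actual streaming variational algorithm; the class name `StreamingCP` and all hyperparameters below are hypothetical.

```python
import numpy as np

class StreamingCP:
    """Toy streaming Bayesian CP sketch (NOT the paper's POST update).

    Each factor row keeps a Gaussian posterior in natural parameters
    (precision matrix S, natural mean h). An observed entry updates one
    row per mode via a conjugate linear-Gaussian step, treating the
    remaining modes as fixed at their posterior means.
    """

    def __init__(self, dims, rank, noise_prec=1.0, prior_prec=1.0):
        self.rank = rank
        self.tau = noise_prec  # assumed Gaussian observation noise precision
        # Per mode: one (rank x rank) precision and one natural mean per row.
        self.S = [np.tile(prior_prec * np.eye(rank), (d, 1, 1)) for d in dims]
        # Small nonzero init so the multiplicative updates can leave zero.
        self.h = [0.1 * np.ones((d, rank)) for d in dims]

    def _mean(self, mode, row):
        # Posterior mean of one factor row: S^{-1} h.
        return np.linalg.solve(self.S[mode][row], self.h[mode][row])

    def update(self, index, value):
        """Sequentially update one row per mode for a single observed entry."""
        for m in range(len(index)):
            # Design vector: Hadamard product of the other modes' mean rows.
            z = np.ones(self.rank)
            for n, i in enumerate(index):
                if n != m:
                    z *= self._mean(n, i)
            r = index[m]
            # Conjugate Gaussian update of the row's natural parameters.
            self.S[m][r] += self.tau * np.outer(z, z)
            self.h[m][r] += self.tau * value * z

    def predict(self, index):
        # CP-style prediction: sum over rank of the elementwise products.
        z = np.ones(self.rank)
        for n, i in enumerate(index):
            z *= self._mean(n, i)
        return float(z.sum())
```

Because every update is a rank-one refinement of the stored precision and natural mean, entries can arrive one at a time and in any order, and nothing needs to be revisited; a prediction is always available from the current posterior means.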
Pages: 99-108
Page count: 10
Related Papers (50 total)
  • [1] The probabilistic tensor decomposition toolbox
    Hinrich, Jesper L.
    Madsen, Kristoffer H.
    Morup, Morten
    MACHINE LEARNING-SCIENCE AND TECHNOLOGY, 2020, 1 (02):
  • [2] Probabilistic Tensor Train Decomposition
    Hinrich, Jesper L.
    Morup, Morten
    2019 27TH EUROPEAN SIGNAL PROCESSING CONFERENCE (EUSIPCO), 2019,
  • [3] Probabilistic Boolean Tensor Decomposition
    Rukat, Tammo
    Holmes, Chris C.
    Yau, Christopher
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 80, 2018, 80
  • [4] Streaming Nonlinear Bayesian Tensor Decomposition
    Pan, Zhimeng
    Wang, Zheng
    Zhe, Shandian
    CONFERENCE ON UNCERTAINTY IN ARTIFICIAL INTELLIGENCE (UAI 2020), 2020, 124 : 490 - 499
  • [5] High Performance Streaming Tensor Decomposition
    Soh, Yongseok
    Flick, Patrick
    Liu, Xing
    Smith, Shaden
    Checconi, Fabio
    Petrini, Fabrizio
    Choi, Jee
    2021 IEEE 35TH INTERNATIONAL PARALLEL AND DISTRIBUTED PROCESSING SYMPOSIUM (IPDPS), 2021, : 683 - 692
  • [6] Undirected Probabilistic Model for Tensor Decomposition
    Tao, Zerui
    Tanaka, Toshihisa
    Zhao, Qibin
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023,
  • [7] A Contemporary and Comprehensive Survey on Streaming Tensor Decomposition
    Thanh, Le Trung
    Abed-Meraim, Karim
    Trung, Nguyen Linh
    Hafiane, Adel
    IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2023, 35 (11) : 10897 - 10921
  • [8] Leveraging Features and Networks for Probabilistic Tensor Decomposition
    Rai, Piyush
    Wang, Yingjian
    Carin, Lawrence
    PROCEEDINGS OF THE TWENTY-NINTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2015, : 2942 - 2948
  • [9] Probabilistic Neural-Kernel Tensor Decomposition
    Tillinghast, Conor
    Fang, Shikai
    Zhang, Kai
    Zhe, Shandian
    20TH IEEE INTERNATIONAL CONFERENCE ON DATA MINING (ICDM 2020), 2020, : 531 - 540
  • [10] Identifying and Alleviating Concept Drift in Streaming Tensor Decomposition
    Pasricha, Ravdeep
    Gujral, Ekta
    Papalexakis, Evangelos E.
    MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES, ECML PKDD 2018, PT II, 2019, 11052 : 327 - 343