Nonparametric tensor ring decomposition with scalable amortized inference

Cited by: 2
Authors
Tao, Zerui [1 ,2 ]
Tanaka, Toshihisa [1 ,2 ]
Zhao, Qibin [1 ,2 ]
Affiliations
[1] Tokyo Univ Agr & Technol, Dept Elect & Informat Engn, Tokyo 1848588, Japan
[2] RIKEN Ctr Adv Intelligence Project AIP, Tokyo 1030027, Japan
Keywords
Tensor decomposition; Tensor completion; Gaussian process; Variational auto-encoder; Data imputation; Factorization
DOI
10.1016/j.neunet.2023.10.031
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Multi-dimensional data are common in many applications, such as videos and multivariate time series. While tensor decomposition (TD) provides promising tools for analyzing such data, several limitations remain. First, traditional TDs assume multi-linear structures of the latent embeddings, which greatly limits their expressive power. Second, TDs cannot be straightforwardly applied to datasets with massive samples. To address these issues, we propose a nonparametric TD with amortized inference networks. Specifically, we establish a non-linear extension of tensor ring decomposition, using neural networks, to model complex latent structures. To jointly model the cross-sample correlations and physical structures, a matrix Gaussian process (GP) prior is imposed over the core tensors. From the learning perspective, we develop a VAE-like amortized inference network to infer the posterior of the core tensors corresponding to new tensor data, which enables TDs to be applied to large datasets. Our model can also be viewed as a kind of decomposition of the VAE, which can additionally capture hidden tensor structure and enhance expressive power. Finally, we derive an evidence lower bound so that a scalable optimization algorithm can be developed. The advantages of our method have been evaluated extensively by data imputation on the Healing MNIST dataset and four multivariate time series datasets.
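To make the pipeline described in the abstract concrete, the following is a minimal, illustrative PyTorch sketch, not the authors' implementation: an encoder network amortizes posterior inference over per-sample tensor-ring (TR) core tensors, a decoder contracts the cores to reconstruct the sample, and a masked negative ELBO is optimized for imputation. For brevity the sketch uses a plain multilinear TR contraction and a standard Gaussian prior in place of the paper's neural-network decoder and matrix GP prior over the cores; all class and function names (e.g. AmortizedTRVAE, neg_elbo) are hypothetical.

```python
import torch
import torch.nn as nn


class AmortizedTRVAE(nn.Module):
    """Toy two-mode tensor-ring (TR) VAE: the encoder amortizes posterior
    inference over per-sample TR cores; the decoder is the TR contraction."""

    def __init__(self, shape=(28, 28), rank=4, hidden=256):
        super().__init__()
        self.shape, self.rank = shape, rank
        i1, i2, r = shape[0], shape[1], rank
        self.core_dim = r * i1 * r + r * i2 * r          # sizes of cores G1, G2
        self.encoder = nn.Sequential(
            nn.Linear(i1 * i2, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * self.core_dim),        # posterior mean, log-variance
        )

    def decode(self, z):
        # Reshape the latent vector into TR cores and contract the ring:
        # X[i, j] = trace(G1[:, i, :] @ G2[:, j, :]).
        (i1, i2), r = self.shape, self.rank
        g1 = z[:, : r * i1 * r].reshape(-1, r, i1, r)
        g2 = z[:, r * i1 * r :].reshape(-1, r, i2, r)
        return torch.einsum('bpiq,bqjp->bij', g1, g2)

    def forward(self, x):
        stats = self.encoder(x.flatten(1))
        mu, logvar = stats.chunk(2, dim=-1)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()  # reparameterization trick
        return self.decode(z), mu, logvar


def neg_elbo(recon, x, mu, logvar, mask):
    """Negative ELBO: reconstruction error on observed entries (for imputation)
    plus the KL between the amortized posterior and a standard Gaussian prior
    (the paper instead places a matrix GP prior over the cores)."""
    recon_err = ((recon - x) ** 2 * mask).sum()
    kl = -0.5 * (1 + logvar - mu ** 2 - logvar.exp()).sum()
    return recon_err + kl


# Usage: impute missing entries of randomly masked 28x28 samples.
model = AmortizedTRVAE()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.rand(32, 28, 28)                  # a batch of toy samples
mask = (torch.rand_like(x) > 0.5).float()   # ~50% observed entries
recon, mu, logvar = model(x * mask)
loss = neg_elbo(recon, x, mu, logvar, mask)
loss.backward()
opt.step()
```

Because inference over the cores is amortized by the encoder rather than re-optimized per sample, new tensor samples can be imputed with a single forward pass, which is what makes the approach scalable to large datasets.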
Pages: 431-441
Number of pages: 11
Related papers
50 records in total
  • [31] Detecting Model Misspecification in Amortized Bayesian Inference with Neural Networks
    Schmitt, Marvin
    Buerkner, Paul-Christian
    Koethe, Ullrich
    Radev, Stefan T.
    PATTERN RECOGNITION, DAGM GCPR 2023, 2024, 14264 : 541 - 557
  • [32] Energy-efficient Amortized Inference with Cascaded Deep Classifiers
    Guan, Jiaqi
    Liu, Yang
    Liu, Qiang
    Peng, Jian
    PROCEEDINGS OF THE TWENTY-SEVENTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2018: 2184 - 2190
  • [33] Neural Amortized Inference for Nested Multi-Agent Reasoning
    Jha, Kunal
    Tuan Anh Le
    Jin, Chuanyang
    Kuo, Yen-Ling
    Tenenbaum, Joshua B.
    Shu, Tianmin
    THIRTY-EIGHTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 38 NO 1, 2024: 530 - 537
  • [34] Scalable Tensor Mining
    Sael, Lee
    Jeon, Inah
    Kang, U.
    BIG DATA RESEARCH, 2015, 2 (02) : 82 - 86
  • [35] A Simple and Scalable Static Analysis for Bound Analysis and Amortized Complexity Analysis
    Sinn, Moritz
    Zuleger, Florian
    Veith, Helmut
    COMPUTER AIDED VERIFICATION, CAV 2014, 2014, 8559 : 745 - 761
  • [36] Generalized Bayesian Inference for Scientific Simulators via Amortized Cost Estimation
    Gao, Richard
    Deistler, Michael
    Macke, Jakob H.
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023
  • [37] Sequential Monte Carlo for Inclusive KL Minimization in Amortized Variational Inference
    McNamara, Declan
    Loper, Jackson
    Regier, Jeffrey
    INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 238, 2024, 238
  • [38] Amortized simulation-based frequentist inference for tractable and intractable likelihoods
    Al Kadhim, Ali
    Prosper, Harrison B.
    Prosper, Olivia F.
    MACHINE LEARNING-SCIENCE AND TECHNOLOGY, 2024, 5 (01)
  • [39] Scalable Nonparametric Tensor Analysis
    Zhe, Shandian
    THIRTY-FIRST AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2017, : 5058 - 5059
  • [40] Reliable amortized variational inference with physics-based latent distribution correction
    Siahkoohi, Ali
    Rizzuti, Gabrio
    Orozco, Rafael
    Herrmann, Felix J.
    GEOPHYSICS, 2023, 88 (03) : R297 - R322