Accelerated Online Low-Rank Tensor Learning for Multivariate Spatio-Temporal Streams

Cited by: 0
Authors
Yu, Rose [1 ]
Cheng, Dehua [1 ]
Liu, Yan [1 ]
Affiliation
[1] Univ Southern Calif, Dept Comp Sci, Los Angeles, CA 90007 USA
Keywords
MULTIMODEL ENSEMBLE;
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Low-rank tensor learning has many applications in machine learning, and a series of batch learning algorithms has achieved great success. However, in many emerging applications, such as climate data analysis, we are confronted with large-scale tensor streams, which pose significant challenges to existing solutions. In this paper, we propose an accelerated online low-rank tensor learning algorithm (ALTO) to solve the problem. At each iteration, we project the current tensor onto a low-dimensional tensor, using the information of the previous low-rank tensor, in order to perform efficient tensor decomposition, and then recover the low-rank approximation of the current tensor. By randomly selecting additional subspaces, we overcome the issue of local optima at extremely low computational cost. We evaluate our method on two tasks in online multivariate spatio-temporal analysis: online forecasting and multi-model ensemble. Experimental results show that our method achieves comparable predictive accuracy with a significant speed-up.
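
The projection-then-decomposition idea in the abstract can be illustrated with a short sketch. The code below is a simplified illustration under stated assumptions, not the authors' ALTO implementation: it operates on a single matrix (one tensor mode) rather than a full multi-mode tensor, and the function name online_lowrank_step, the use of NumPy, the two random augmentation directions, and the one-off batch warm start are all hypothetical choices made for the example.

# Minimal sketch (assumption): an online low-rank update for a single tensor mode.
# ALTO itself works on full multi-mode tensors; this only illustrates the
# "project -> small decomposition -> recover" idea on matrix slices.
import numpy as np

def online_lowrank_step(U_prev, X_t, rank, n_random=2, rng=None):
    """One online step: project X_t onto the previous subspace augmented with a
    few random directions, factorize the small projection, and recover a
    rank-`rank` approximation together with the updated subspace basis."""
    rng = np.random.default_rng() if rng is None else rng
    d = X_t.shape[0]

    # Augment the previous basis with random directions (to help escape local
    # optima), then re-orthonormalize with QR so the projection is well conditioned.
    R = rng.standard_normal((d, n_random))
    U_aug, _ = np.linalg.qr(np.hstack([U_prev, R]))

    # Project the incoming slice into the small (rank + n_random)-dim subspace.
    Y = U_aug.T @ X_t                      # small matrix: cheap to decompose

    # Decompose the small projection and keep the top-`rank` components.
    P, s, Vt = np.linalg.svd(Y, full_matrices=False)
    P, s, Vt = P[:, :rank], s[:rank], Vt[:rank]

    U_new = U_aug @ P                      # updated subspace basis (d x rank)
    X_hat = U_new @ (s[:, None] * Vt)      # rank-`rank` approximation of X_t
    return U_new, X_hat

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    d, n, rank = 200, 150, 5

    # Simulated stream: every slice shares the same rank-5 column space, plus noise.
    A = np.linalg.qr(rng.standard_normal((d, rank)))[0]
    X0 = A @ rng.standard_normal((rank, n))

    # One-off batch SVD as a warm start for the subspace (illustrative choice).
    U = np.linalg.svd(X0, full_matrices=False)[0][:, :rank]

    for t in range(10):
        X_t = A @ rng.standard_normal((rank, n)) + 0.01 * rng.standard_normal((d, n))
        U, X_hat = online_lowrank_step(U, X_t, rank, rng=rng)
        err = np.linalg.norm(X_t - X_hat) / np.linalg.norm(X_t)
        print(f"step {t}: relative approximation error {err:.3e}")

In this sketch the expensive factorization is carried out on the small projected matrix, so its cost depends on the subspace dimension rather than the ambient dimension; only a cheap matrix product touches the full incoming slice. This is the kind of saving the abstract's speed-up claim refers to.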
Pages: 238-247
Number of pages: 10
Related Papers
50 items in total (items [41]-[50] shown below)
  • [41] Reduced Rank Spatio-Temporal Models. Pu, Dan; Fang, Kuangnan; Lan, Wei; Yu, Jihai; Zhang, Qingzhao. JOURNAL OF BUSINESS & ECONOMIC STATISTICS, 2025, 43 (01): 98-109.
  • [42] An online spatio-temporal tensor learning model for visual tracking and its applications to facial expression recognition. Khan, Sheheryar; Xu, Guoxia; Chan, Raymond; Yan, Hong. EXPERT SYSTEMS WITH APPLICATIONS, 2017, 90: 427-438.
  • [43] An Implementable Accelerated Alternating Direction Method of Multipliers for Low-Rank Tensor Completion. Qiu, Duo; Zhu, Hu; Zhang, Xiongjun. IEEE 2018 INTERNATIONAL CONGRESS ON CYBERMATICS / 2018 IEEE CONFERENCES ON INTERNET OF THINGS, GREEN COMPUTING AND COMMUNICATIONS, CYBER, PHYSICAL AND SOCIAL COMPUTING, SMART DATA, BLOCKCHAIN, COMPUTER AND INFORMATION TECHNOLOGY, 2018: 537-541.
  • [44] Low-Rank Tensor Completion Method for Implicitly Low-Rank Visual Data. Ji, Teng-Yu; Zhao, Xi-Le; Sun, Dong-Lin. IEEE SIGNAL PROCESSING LETTERS, 2022, 29: 1162-1166.
  • [45] Iterative tensor eigen rank minimization for low-rank tensor completion. Su, Liyu; Liu, Jing; Tian, Xiaoqing; Huang, Kaiyu; Tan, Shuncheng. INFORMATION SCIENCES, 2022, 616: 303-329.
  • [46] Sparse and Low-Rank Tensor Decomposition. Shah, Parikshit; Rao, Nikhil; Tang, Gongguo. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 28 (NIPS 2015), 2015, 28.
  • [47] Nonparametric Low-Rank Tensor Imputation. Bazerque, Juan Andres; Mateos, Gonzalo; Giannakis, Georgios B. 2012 IEEE STATISTICAL SIGNAL PROCESSING WORKSHOP (SSP), 2012: 876-879.
  • [48] Low-Rank Regression with Tensor Responses. Rabusseau, Guillaume; Kadri, Hachem. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 29 (NIPS 2016), 2016, 29.
  • [49] Multiresolution Low-Rank Tensor Formats. Mickelin, Oscar; Karaman, Sertac. SIAM JOURNAL ON MATRIX ANALYSIS AND APPLICATIONS, 2020, 41 (03): 1086-1114.
  • [50] Low-Rank Tensor Huber Regression. Wei, Yangxin; Luo, Ziyan; Chen, Yang. PACIFIC JOURNAL OF OPTIMIZATION, 2022, 18 (02): 439-458.