Online Robust Low-Rank Tensor Learning

Cited by: 0
Authors
Li, Ping [1 ,2 ]
Feng, Jiashi [2 ]
Jin, Xiaojie [2 ]
Zhang, Luming [3 ]
Xu, Xianghua [1 ]
Yan, Shuicheng [2 ,4 ]
Affiliations
[1] Hangzhou Dianzi Univ, Sch Comp Sci & Technol, Hangzhou, Peoples R China
[2] Natl Univ Singapore, Sch Comp Sci & Technol, Singapore, Singapore
[3] Hefei Univ Technol, Dept Comp & Informat, Hefei, Peoples R China
[4] Qihoo 360 Artificial Intelligence Inst, Beijing, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
DOI
Not available
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
The rapid increase of multidimensional data (a.k.a. tensors) such as videos brings new challenges for low-rank data modeling approaches, including dynamic data size, complex high-order relations, and multiplicity of low-rank structures. Resolving these challenges requires a tensor analysis method that can process tensor data online, which is still absent. In this paper, we propose an Online Robust Low-rank Tensor Modeling (ORLTM) approach to address these challenges. ORLTM dynamically explores the high-order correlations across all tensor modes for low-rank structure modeling. To analyze mixture data from multiple subspaces, ORLTM introduces a new dictionary learning component. ORLTM processes data in a streaming fashion and thus requires only a small memory footprint that is independent of the data size, which makes it well suited to large-scale tensor data. Empirical studies validate the effectiveness of the proposed method on both synthetic data and one practical task, i.e., video background subtraction. In addition, we provide a theoretical analysis of the computational complexity and memory cost, rigorously demonstrating the efficiency of ORLTM.
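The abstract only sketches the method at a high level. As a rough illustration of the streaming "low-rank background plus sparse foreground" idea it describes, the following minimal Python/NumPy sketch processes vectorized video frames one at a time with a learned dictionary and a soft-thresholded sparse error term. This is not the actual ORLTM algorithm (which exploits correlations across all tensor modes); the function names, the per-frame vectorization, the regularization weights, and the specific update rules below are illustrative assumptions.

    import numpy as np

    def soft_threshold(x, tau):
        """Element-wise soft-thresholding (proximal operator of the l1 norm)."""
        return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

    def process_stream(frames, rank=10, lam1=0.1, lam2=0.05, inner_iters=10, lr=0.1):
        """Streaming robust low-rank modeling of vectorized frames (illustrative sketch).

        frames : iterable of 1-D numpy arrays (flattened video frames)
        Returns a list of (background, foreground) estimates, one pair per frame.
        """
        rng = np.random.default_rng(0)
        D = None  # dictionary / basis of the low-rank subspace, learned online
        results = []
        for x in frames:
            if D is None:
                D = rng.standard_normal((x.size, rank)) * 0.01
            e = np.zeros_like(x)
            # Alternate between the subspace coefficients r and the sparse error e.
            for _ in range(inner_iters):
                r = np.linalg.solve(D.T @ D + lam1 * np.eye(rank), D.T @ (x - e))
                e = soft_threshold(x - D @ r, lam2)
            # Online dictionary update: one gradient step on the reconstruction error.
            D -= lr * np.outer(D @ r + e - x, r)
            results.append((D @ r, e))  # (low-rank background, sparse foreground)
        return results

Fed flattened grayscale frames one at a time, the D @ r term approximates the slowly varying background while the sparse term e highlights moving objects. Memory usage stays proportional to the frame size times the chosen rank, independent of how many frames have been seen, which mirrors the streaming memory property claimed in the abstract.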
Pages: 2180-2186
Number of pages: 7
Related Papers
50 items in total
  • [41] Zhang, Shipeng; Wang, Lizhi; Zhang, Lei; Huang, Hua. Learning Tensor Low-Rank Prior for Hyperspectral Image Reconstruction. 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR 2021), 2021: 12001-12010.
  • [42] Humbert, Pierre; Oudre, Laurent; Vayatis, Nicolas; Audiffren, Julien. Tensor Convolutional Dictionary Learning With CP Low-Rank Activations. IEEE Transactions on Signal Processing, 2022, 70: 785-796.
  • [43] Wang, Minghua; Wang, Qiang; Hong, Danfeng; Roy, Swalpa Kumar; Chanussot, Jocelyn. Learning Tensor Low-Rank Representation for Hyperspectral Anomaly Detection. IEEE Transactions on Cybernetics, 2023, 53(1): 679-691.
  • [44] Ren, Jiahuan; Zhang, Zhao; Li, Sheng; Liu, Guangcan; Wang, Meng; Yan, Shuicheng. Robust Projective Low-Rank and Sparse Representation by Robust Dictionary Learning. 2018 24th International Conference on Pattern Recognition (ICPR), 2018: 1851-1856.
  • [45] Ji, Teng-Yu; Zhao, Xi-Le; Sun, Dong-Lin. Low-Rank Tensor Completion Method for Implicitly Low-Rank Visual Data. IEEE Signal Processing Letters, 2022, 29: 1162-1166.
  • [46] Su, Liyu; Liu, Jing; Tian, Xiaoqing; Huang, Kaiyu; Tan, Shuncheng. Iterative Tensor Eigen Rank Minimization for Low-Rank Tensor Completion. Information Sciences, 2022, 616: 303-329.
  • [47] Bazerque, Juan Andres; Mateos, Gonzalo; Giannakis, Georgios B. Nonparametric Low-Rank Tensor Imputation. 2012 IEEE Statistical Signal Processing Workshop (SSP), 2012: 876-879.
  • [48] Rabusseau, Guillaume; Kadri, Hachem. Low-Rank Regression with Tensor Responses. Advances in Neural Information Processing Systems 29 (NIPS 2016), 2016, 29.
  • [49] Mickelin, Oscar; Karaman, Sertac. Multiresolution Low-Rank Tensor Formats. SIAM Journal on Matrix Analysis and Applications, 2020, 41(3): 1086-1114.
  • [50] Shah, Parikshit; Rao, Nikhil; Tang, Gongguo. Sparse and Low-Rank Tensor Decomposition. Advances in Neural Information Processing Systems 28 (NIPS 2015), 2015, 28.