A Generalized Model for Robust Tensor Factorization With Noise Modeling by Mixture of Gaussians

Cited by: 54
Authors
Chen, Xi'ai [1 ,2 ]
Han, Zhi [1 ]
Wang, Yao [3 ]
Zhao, Qian [3 ]
Meng, Deyu [3 ]
Lin, Lin [3 ]
Tang, Yandong [1 ]
Affiliations
[1] Chinese Acad Sci, Shenyang Inst Automat, State Key Lab Robot, Shenyang 110016, Liaoning, Peoples R China
[2] Univ Chinese Acad Sci, Beijing 100049, Peoples R China
[3] Xi An Jiao Tong Univ, Sch Math & Stat, Xian 710049, Shaanxi, Peoples R China
Funding
National Natural Science Foundation of China; China Postdoctoral Science Foundation
Keywords
Expectation-maximization (EM) algorithm; generalized weighted low-rank tensor factorization (GWLRTF); mixture of Gaussians (MoG) model; tensor factorization; DIMENSIONALITY REDUCTION; IMAGES; RANK; RECOGNITION; MOTION;
DOI
10.1109/TNNLS.2018.2796606
Chinese Library Classification (CLC)
TP18 [Theory of artificial intelligence]
Discipline codes
081104; 0812; 0835; 1405
Abstract
The low-rank tensor factorization (LRTF) technique has received increasing attention in many computer vision applications. Compared with the traditional matrix factorization technique, it better preserves the intrinsic structural information of the data and thus achieves better low-dimensional subspace recovery. Typically, the desired low-rank tensor is recovered by minimizing the least-squares loss between the input data and its factorized representation. Since the least-squares loss is optimal only when the noise follows a Gaussian distribution, L1-norm-based methods have been designed to deal with outliers. Unfortunately, they may lose their effectiveness on real data, which are often contaminated by complex noise. In this paper, we consider integrating a noise modeling technique into a generalized weighted LRTF (GWLRTF) procedure, which treats the original issue as an LRTF problem and models the noise by a mixture of Gaussians (MoG); the resulting method is called MoG GWLRTF. To extend the applicability of the model, two typical tensor factorization operations, i.e., CANDECOMP/PARAFAC (CP) factorization and Tucker factorization, are incorporated into the LRTF procedure. The model parameters are updated under the expectation-maximization (EM) framework. Extensive experiments indicate the respective advantages of the two versions of MoG GWLRTF in various applications and demonstrate their effectiveness compared with other competing methods.
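The abstract describes alternating between EM updates for the MoG noise parameters and weighted factor updates. The following is a minimal sketch of that idea for the CP version, not the authors' actual implementation; all names (`mog_cp`, `em_mog_step`, `weighted_update`) and the simple weighted-ALS inner solver are illustrative assumptions:

```python
# Hypothetical sketch of MoG-weighted CP factorization via EM (not the
# paper's code). E-step: per-entry responsibilities under K zero-mean
# Gaussians on the residual. M-step: update mixture parameters, then
# refit the CP factors by per-entry weighted least squares.
import numpy as np

def khatri_rao(A, B):
    # Column-wise Kronecker product: row a*B.shape[0]+b equals A[a]*B[b].
    return np.einsum('ar,br->abr', A, B).reshape(-1, A.shape[1])

def cp_reconstruct(U, V, W):
    # X_ijk ≈ sum_r U_ir V_jr W_kr
    return np.einsum('ir,jr,kr->ijk', U, V, W)

def em_mog_step(resid, pi, var):
    # E-step: responsibilities gamma (N, K) in log space for stability.
    r = resid.ravel()[:, None]
    log_p = np.log(pi) - 0.5 * np.log(2 * np.pi * var) - 0.5 * r**2 / var
    log_p -= log_p.max(axis=1, keepdims=True)
    gamma = np.exp(log_p)
    gamma /= gamma.sum(axis=1, keepdims=True)
    # M-step: mixing proportions and variances of the Gaussian components.
    Nk = gamma.sum(axis=0) + 1e-12
    pi = Nk / gamma.shape[0]
    var = np.maximum((gamma * r**2).sum(axis=0) / Nk, 1e-8)
    return gamma, pi, var

def weighted_update(X, w, factors, mode):
    # Row-wise weighted least squares for one factor matrix.
    n = X.shape[mode]
    Xm = np.moveaxis(X, mode, 0).reshape(n, -1)
    Wm = np.moveaxis(w, mode, 0).reshape(n, -1)
    Kr = khatri_rao(*[f for m, f in enumerate(factors) if m != mode])
    rank = Kr.shape[1]
    F = np.empty((n, rank))
    for i in range(n):
        G = Kr.T @ (Wm[i][:, None] * Kr) + 1e-8 * np.eye(rank)
        F[i] = np.linalg.solve(G, Kr.T @ (Wm[i] * Xm[i]))
    return F

def mog_cp(X, rank=2, K=2, iters=40, seed=0):
    rng = np.random.default_rng(seed)
    factors = [rng.standard_normal((n, rank)) for n in X.shape]
    pi, var = np.full(K, 1.0 / K), np.linspace(0.1, 1.0, K)
    for _ in range(iters):
        gamma, pi, var = em_mog_step(X - cp_reconstruct(*factors), pi, var)
        # Entry weight = expected noise precision under the responsibilities,
        # which turns the factor update into a weighted LRTF step.
        w = (gamma / var).sum(axis=1).reshape(X.shape)
        for mode in range(3):
            factors[mode] = weighted_update(X, w, factors, mode)
    return factors, pi, var
```

Entries judged to belong to a high-variance component receive small weights, so outliers barely influence the factor fit; this is the mechanism by which MoG modeling generalizes both the L2 (single Gaussian) and robust L1-style losses.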
Pages: 5380 - 5393 (14 pages)
Related papers
50 records
  • [31] An Improved Mixture-of-Gaussians Model for Background Subtraction
    Li, Heng-hui
    Yang, Jin-feng
    Ren, Xiao-hui
    Wu, Ren-biao
    ICSP: 2008 9TH INTERNATIONAL CONFERENCE ON SIGNAL PROCESSING, VOLS 1-5, PROCEEDINGS, 2008, : 1381 - 1384
  • [32] Nonrecurrent traffic congestion detection with a coupled scalable Bayesian robust tensor factorization model
    Li, Qin
    Tan, Huachun
    Jiang, Zhuxi
    Wu, Yuankai
    Ye, Linhui
    NEUROCOMPUTING, 2021, 430 : 138 - 149
  • [33] Parallelization of the Mixture of Gaussians Model for Motion Detection on the GPU
    Kovacev, Petar
    Misic, Marko
    Tomasevic, Milo
    2018 ZOOMING INNOVATION IN CONSUMER TECHNOLOGIES CONFERENCE (ZINC), 2018, : 58 - 61
  • [34] Generalized Coupled Symmetric Tensor Factorization for Link Prediction
    Ermis, Beyza
    Cemgil, A. Taylan
    Acar, Evrim
    2013 21ST SIGNAL PROCESSING AND COMMUNICATIONS APPLICATIONS CONFERENCE (SIU), 2013,
  • [35] Numerical method for the generalized nonnegative tensor factorization problem
    Duan, Xue-Feng
    Li, Juan
    Duan, Shan-Qi
    Wang, Qing-Wen
    NUMERICAL ALGORITHMS, 2021, 87 (02) : 499 - 510
  • [36] Bayesian Robust Tensor Factorization for Incomplete Multiway Data
    Zhao, Qibin
    Zhou, Guoxu
    Zhang, Liqing
    Cichocki, Andrzej
    Amari, Shun-Ichi
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2016, 27 (04) : 736 - 748
  • [37] Robust Tensor Factorization Using Maximum Correntropy Criterion
    Zhang, Miaohua
    Gao, Yongsheng
    Sun, Changming
    La Salle, John
    Liang, Junli
    2016 23RD INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR), 2016, : 4184 - 4189
  • [39] Tensor RPCA by Bayesian CP Factorization with Complex Noise
    Luo, Qiong
    Han, Zhi
    Chen, Xi'ai
    Wang, Yao
    Meng, Deyu
    Liang, Dong
    Tang, Yandong
    2017 IEEE INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV), 2017, : 5029 - 5038
  • [40] Keywords: variance modeling; process and noise variables; generalized linear models; model selection; robust design
    Pinto, Edmilson Rodrigues
    Pereira, Leandro Alves
    CHEMOMETRICS AND INTELLIGENT LABORATORY SYSTEMS, 2022, 227