Accurate Tensor Completion via Adaptive Low-Rank Representation

Cited by: 15
Authors
Zhang, Lei [1 ]
Wei, Wei [2 ,3 ,4 ]
Shi, Qinfeng [5 ,6 ]
Shen, Chunhua [5 ,6 ]
van den Hengel, Anton [5 ,6 ]
Zhang, Yanning [2 ,3 ,4 ]
Affiliations
[1] Incept Inst Artificial Intelligence IIAI, Abu Dhabi, U Arab Emirates
[2] Northwestern Polytech Univ, Sch Comp Sci, Xian 710072, Peoples R China
[3] Northwestern Polytech Univ, Natl Engn Lab Integrated Aerosp Ground Ocean Big, Xian 710072, Peoples R China
[4] Northwestern Polytech Univ, Shaanxi Prov Key Lab Speech & Image Informat Proc, Xian 710072, Peoples R China
[5] Univ Adelaide, Sch Comp Sci, Adelaide, SA 5005, Australia
[6] Australian Inst Machine Learning, Adelaide, SA 5005, Australia
Funding
National Natural Science Foundation of China;
Keywords
Tensors; Adaptation models; Data models; Bayes methods; Learning systems; Computer science; Australia; Adaptive low-rank representation; automatic tensor rank determination; tensor completion; SPARSITY; FACTORIZATION;
DOI
10.1109/TNNLS.2019.2952427
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Low-rank representation-based approaches that assume low-rank tensors and exploit their low-rank structure with appropriate prior models have underpinned much of the recent progress in tensor completion. However, real tensor data only approximately comply with the low-rank requirement in most cases, viz., the tensor consists of low-rank (e.g., the principal part) as well as non-low-rank (e.g., details) structures, which limits the completion accuracy of these approaches. To address this problem, we propose an adaptive low-rank representation model for tensor completion that represents the low-rank and non-low-rank structures of a latent tensor separately in a Bayesian framework. Specifically, we reformulate the CANDECOMP/PARAFAC (CP) tensor rank and develop a sparsity-induced prior for the low-rank structure that can be used to determine the tensor rank automatically. Then, the non-low-rank structure is modeled using a mixture-of-Gaussians prior that is shown to be sufficiently flexible and powerful to inform the completion process for a variety of real tensor data. With these two priors, we develop a Bayesian minimum mean-squared error estimation framework for inference. The developed framework can capture the important distinctions between low-rank and non-low-rank structures, thereby enabling a more accurate model and, ultimately, more accurate completion. For various applications, compared with the state-of-the-art methods, the proposed model yields more accurate completion results.
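To make the CP-based ingredients of the abstract concrete, the following is a minimal, illustrative sketch (not the paper's Bayesian implementation): it completes a 3-way tensor by EM-style alternating least squares over the CP factors, then prunes rank-1 components with negligible energy, which loosely mimics the idea of sparsity-induced automatic rank determination. The function names, the pruning threshold, and the pruning rule itself are all assumptions for illustration.

```python
import numpy as np

def khatri_rao(U, V):
    """Column-wise Khatri-Rao product: (U kr V)[i*V.shape[0]+j, r] = U[i,r]*V[j,r]."""
    return np.einsum('ir,jr->ijr', U, V).reshape(-1, U.shape[1])

def cp_reconstruct(A, B, C):
    """Rebuild T[i,j,k] = sum_r A[i,r]*B[j,r]*C[k,r]."""
    return np.einsum('ir,jr,kr->ijk', A, B, C)

def cp_complete(T, mask, rank=6, iters=300, prune_tol=1e-3, seed=0):
    """Complete a 3-way tensor from the entries where mask is True.

    Illustrative only: plain ALS with imputation, not the paper's
    Bayesian MMSE inference; prune_tol is an arbitrary assumption.
    """
    rng = np.random.default_rng(seed)
    I, J, K = T.shape
    A = rng.standard_normal((I, rank))
    B = rng.standard_normal((J, rank))
    C = rng.standard_normal((K, rank))
    for _ in range(iters):
        # E-step: impute missing entries with the current low-rank estimate.
        X = np.where(mask, T, cp_reconstruct(A, B, C))
        # M-step: ALS factor updates via mode-n unfoldings (C-order reshapes).
        X0 = X.reshape(I, -1)                     # mode-0: I x (J*K)
        X1 = np.moveaxis(X, 1, 0).reshape(J, -1)  # mode-1: J x (I*K)
        X2 = np.moveaxis(X, 2, 0).reshape(K, -1)  # mode-2: K x (I*J)
        A = np.linalg.lstsq(khatri_rao(B, C), X0.T, rcond=None)[0].T
        B = np.linalg.lstsq(khatri_rao(A, C), X1.T, rcond=None)[0].T
        C = np.linalg.lstsq(khatri_rao(A, B), X2.T, rcond=None)[0].T
    # Crude stand-in for automatic rank determination: keep components
    # whose rank-1 energy is non-negligible relative to the largest one.
    w = (np.linalg.norm(A, axis=0) * np.linalg.norm(B, axis=0)
         * np.linalg.norm(C, axis=0))
    keep = w > prune_tol * w.max()
    return cp_reconstruct(A[:, keep], B[:, keep], C[:, keep]), int(keep.sum())
```

The paper's model additionally places a mixture-of-Gaussians prior on the residual (the non-low-rank structure); in this sketch that residual is simply ignored, which is exactly the limitation the abstract argues against.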
Pages: 4170-4184
Page count: 15