Robust Low-Rank Tensor Decomposition with the L2 Criterion

Cited by: 0
Authors
Heng, Qiang [1 ]
Chi, Eric C. [2 ]
Liu, Yufeng [3 ]
Affiliations
[1] N Carolina State Univ, Dept Stat, Raleigh, NC 27695 USA
[2] Rice Univ, Dept Stat, Houston, TX 77005 USA
[3] Univ N Carolina, Dept Biostat, Dept Genet, Dept Stat & Operat Res, Chapel Hill, NC 27515 USA
Funding
National Science Foundation (NSF);
Keywords
Inverse problem; L-2 criterion; Nonconvexity; Robustness; Tucker decomposition; ALGORITHM; TRANSFORMATION; COMPLETION;
DOI
10.1080/00401706.2023.2200541
Chinese Library Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics];
Discipline Classification Code
020208; 070103; 0714;
Abstract
The growing prevalence of tensor data, or multiway arrays, in science and engineering applications motivates the need for tensor decompositions that are robust against outliers. In this article, we present a robust Tucker decomposition estimator based on the L-2 criterion, called the Tucker-L2E. Our numerical experiments demonstrate that Tucker-L2E has empirically stronger recovery performance in more challenging high-rank scenarios compared with existing alternatives. The appropriate Tucker-rank can be selected in a data-driven manner with cross-validation or hold-out validation. The practical effectiveness of Tucker-L2E is validated on real data applications in fMRI tensor denoising, PARAFAC analysis of fluorescence data, and feature extraction for classification of corrupted images.
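To make the abstract's central idea more concrete, the sketch below evaluates an L2E-style robust loss on the residuals of a Tucker reconstruction. The function names, the plain-NumPy mode products, and the Gaussian error model with a fixed scale are assumptions chosen for illustration; this is a sketch of the general L2 (L2E) criterion, not the authors' Tucker-L2E implementation.

```python
# Minimal sketch of an L2E-style robust loss applied to a Tucker fit.
# All names here (tucker_reconstruct, l2e_loss) are illustrative assumptions,
# not the authors' Tucker-L2E code.
import numpy as np

def tucker_reconstruct(core, factors):
    """Rebuild a 3-way tensor from a Tucker core G and factors U1, U2, U3."""
    U1, U2, U3 = factors
    return np.einsum('abc,ia,jb,kc->ijk', core, U1, U2, U3)

def l2e_loss(data, fit, sigma):
    """L2E criterion for a Gaussian error model with scale sigma:
    the integral of the squared model density minus twice the average
    model density evaluated at the observed residuals."""
    resid = (data - fit).ravel()
    # For a Gaussian kernel, \int phi_sigma(x)^2 dx = 1 / (2 * sigma * sqrt(pi)).
    term1 = 1.0 / (2.0 * sigma * np.sqrt(np.pi))
    phi = np.exp(-0.5 * (resid / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))
    return term1 - 2.0 * phi.mean()

# Toy usage: a low-rank tensor observed with noise and sparse gross outliers.
rng = np.random.default_rng(0)
core = rng.standard_normal((2, 2, 2))
factors = [rng.standard_normal((n, 2)) for n in (10, 12, 8)]
truth = tucker_reconstruct(core, factors)
noisy = truth + 0.1 * rng.standard_normal(truth.shape)
mask = rng.random(truth.shape) < 0.05          # corrupt ~5% of the entries
noisy[mask] += 10.0 * rng.standard_normal(mask.sum())
print(l2e_loss(noisy, truth, sigma=0.1))
```

Because each residual enters only through a bounded Gaussian kernel term, grossly corrupted entries contribute almost nothing to this objective, in contrast with the unbounded squared-error loss; minimizing such a criterion over the core and factor matrices is the kind of mechanism that yields robustness to outliers.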
Pages: 537 - 552
Page count: 16
Related Papers
50 records in total
  • [1] Robust Low-Rank and Sparse Tensor Decomposition for Low-Rank Tensor Completion
    Shi, Yuqing
    Du, Shiqiang
    Wang, Weilan
    PROCEEDINGS OF THE 33RD CHINESE CONTROL AND DECISION CONFERENCE (CCDC 2021), 2021, : 7138 - 7143
  • [2] Sparse and Low-Rank Tensor Decomposition
    Shah, Parikshit
    Rao, Nikhil
    Tang, Gongguo
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 28 (NIPS 2015), 2015, 28
  • [3] ROBUST LOW-RANK TENSOR MODELLING USING TUCKER AND CP DECOMPOSITION
    Xue, Niannan
    Papamakarios, George
    Bahri, Mehdi
    Panagakis, Yannis
    Zafeiriou, Stefanos
    2017 25TH EUROPEAN SIGNAL PROCESSING CONFERENCE (EUSIPCO), 2017, : 1185 - 1189
  • [4] Statistical mechanics of low-rank tensor decomposition
    Kadmon, Jonathan
    Ganguli, Surya
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 31 (NIPS 2018), 2018, 31
  • [5] Statistical mechanics of low-rank tensor decomposition
    Kadmon, Jonathan
    Ganguli, Surya
    JOURNAL OF STATISTICAL MECHANICS-THEORY AND EXPERIMENT, 2019, 2019 (12):
  • [6] Robust Low-Rank Tensor Ring Completion
    Huang, Huyan
    Liu, Yipeng
    Long, Zhen
    Zhu, Ce
    IEEE TRANSACTIONS ON COMPUTATIONAL IMAGING, 2020, 6 : 1117 - 1126
  • [7] Online Robust Low-Rank Tensor Learning
    Li, Ping
    Feng, Jiashi
    Jin, Xiaojie
    Zhang, Luming
    Xu, Xianghua
    Yan, Shuicheng
    PROCEEDINGS OF THE TWENTY-SIXTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2017, : 2180 - 2186
  • [8] Tensor Denoising Using Low-Rank Tensor Train Decomposition
    Gong, Xiao
    Chen, Wei
    Chen, Jie
    Ai, Bo
    IEEE SIGNAL PROCESSING LETTERS, 2020, 27 : 1685 - 1689
  • [9] A Robust Canonical Polyadic Tensor Decomposition via Structured Low-Rank Matrix Approximation
    Akema, Riku
    Yamagishi, Masao
    Yamada, Isao
    IEICE TRANSACTIONS ON FUNDAMENTALS OF ELECTRONICS COMMUNICATIONS AND COMPUTER SCIENCES, 2022, E105A (01) : 11 - 24
  • [10] ROBUST CBCT RECONSTRUCTION BASED ON LOW-RANK TENSOR DECOMPOSITION AND TOTAL VARIATION REGULARIZATION
    Tian, Xin
    Chen, Wei
    Zhao, Fang
    Li, Bo
    Wang, Zhongyuan
    2020 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING (ICIP), 2020, : 330 - 334