L1-Norm RESCAL Decomposition

Cited: 0
Authors
Tsitsikas, Yorgos [1 ]
Chachlakis, Dimitris G. [2 ]
Papalexakis, Evangelos E. [1 ]
Markopoulos, Panos P. [2 ]
Affiliations
[1] Univ Calif Riverside, Riverside, CA 92521 USA
[2] Rochester Inst Technol, Rochester, NY 14623 USA
Funding
U.S. National Science Foundation (NSF)
Keywords
RESCAL; tensor; decomposition; graph; outlier; L1-norm;
DOI
10.1109/IEEECONF51394.2020.9443401
CLC Number
TP [automation technology; computer technology]
Discipline Code
0812
Abstract
Multi-way arrays (tensors) naturally model multi-relational data. RESCAL is a popular tensor-based relational learning model. Despite its documented success, the original RESCAL solver is sensitive to outliers, arguably due to its L2-norm formulation. Absolute-projection RESCAL (A-RESCAL), an L1-norm reformulation of RESCAL, has been proposed as an outlier-resistant alternative. However, although efficient algorithms have been proposed for both formulations, they optimize the factor matrices of the first and second modes of the data tensor independently, and, to our knowledge, no formal guarantees have been presented that this treatment always yields monotonic convergence for the original non-relaxed RESCAL formulation. In this work we propose a novel L1-norm-based algorithm that resolves this issue while enjoying robustness of the same nature as A-RESCAL. Additionally, we show that the proposed method is closely related to a heavily studied problem in the optimization literature, which equips it with numerical stability and computational efficiency. Lastly, we present a series of numerical studies on synthetic and real-world datasets that corroborate the robustness advantages of the L1-norm formulation over its L2-norm counterpart.
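For context, the baseline the abstract critiques is the standard L2-norm RESCAL model, which fits each frontal slice of the data tensor as X_k ≈ A R_k Aᵀ. Below is a minimal NumPy sketch of the classic alternating-least-squares updates for that L2 objective; the function name and defaults are illustrative, and this is the conventional solver (in which the A update treats the left and right occurrences of A independently), not the paper's proposed L1-norm algorithm.

```python
import numpy as np


def rescal_als(X, rank, n_iter=50, seed=0):
    """Sketch of L2-norm RESCAL via alternating least squares.

    Fits X[k] ~= A @ R[k] @ A.T for a list of n-by-n slices X.
    Illustrative baseline only, not the paper's L1-norm method.
    """
    rng = np.random.default_rng(seed)
    n = X[0].shape[0]
    A = rng.standard_normal((n, rank))
    for _ in range(n_iter):
        # R_k update: exact least-squares fit of each core slice given A.
        Ap = np.linalg.pinv(A)
        R = [Ap @ Xk @ Ap.T for Xk in X]
        # A update: closed-form step that handles the left and right
        # occurrences of A independently -- the treatment for which,
        # as the abstract notes, no monotonic-convergence guarantee
        # has been shown.
        AtA = A.T @ A
        num = sum(Xk @ A @ Rk.T + Xk.T @ A @ Rk for Xk, Rk in zip(X, R))
        den = sum(Rk @ AtA @ Rk.T + Rk.T @ AtA @ Rk for Rk in R)
        A = num @ np.linalg.pinv(den)
    # Refit the core slices once more so R matches the returned A.
    Ap = np.linalg.pinv(A)
    R = [Ap @ Xk @ Ap.T for Xk in X]
    return A, R
```

The L1-norm variants discussed in the paper replace the squared (Frobenius) error in this objective with a sum of absolute errors, which damps the influence of outlying slices and entries.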
Pages: 940-944
Page count: 5
Related Papers
50 records in total
  • [1] L1-Norm Tucker Tensor Decomposition
    Chachlakis, Dimitris G.
    Prater-Bennette, Ashley
    Markopoulos, Panos P.
    [J]. IEEE ACCESS, 2019, 7 : 178454 - 178465
  • [2] Dynamic L1-Norm Tucker Tensor Decomposition
    Chachlakis, Dimitris G.
    Dhanaraj, Mayur
    Prater-Bennette, Ashley
    Markopoulos, Panos P.
    [J]. IEEE JOURNAL OF SELECTED TOPICS IN SIGNAL PROCESSING, 2021, 15 (03) : 587 - 602
  • [3] Pruning filters with L1-norm and capped L1-norm for CNN compression
    Kumar, Aakash
    Shaikh, Ali Muhammad
    Li, Yun
    Bilal, Hazrat
    Yin, Baoqun
    [J]. APPLIED INTELLIGENCE, 2021, 51 (02) : 1152 - 1160
  • [4] Notes on quantum coherence with l1-norm and convex-roof l1-norm
    Zhu, Jiayao
    Ma, Jian
    Zhang, Tinggui
    [J]. QUANTUM INFORMATION PROCESSING, 2021, 20 (12)
  • [5] Supporting vectors for the l1-norm and the l∞-norm and an application
    Sanchez-Alzola, Alberto
    Garcia-Pacheco, Francisco Javier
    Naranjo-Guerra, Enrique
    Moreno-Pulido, Soledad
    [J]. MATHEMATICAL SCIENCES, 2021, 15 (02) : 173 - 187
  • [6] Linearized alternating directions method for l1-norm inequality constrained l1-norm minimization
    Cao, Shuhan
    Xiao, Yunhai
    Zhu, Hong
    [J]. APPLIED NUMERICAL MATHEMATICS, 2014, 85 : 142 - 153
  • [7] MRPP tests in L1-norm
    Tracy, D. S.
    Khan, K. A.
    [J]. COMPUTATIONAL STATISTICS & DATA ANALYSIS, 1987, 5 (04) : 373 - 380
  • [8] L1-norm quantile regression
    Li, Youjuan
    Zhu, Ji
    [J]. JOURNAL OF COMPUTATIONAL AND GRAPHICAL STATISTICS, 2008, 17 (01) : 163 - 185
  • [9] Robust Decomposition of 3-way Tensors based on L1-norm
    Chachlakis, Dimitris G.
    Markopoulos, Panos P.
    [J]. COMPRESSIVE SENSING VII: FROM DIVERSE MODALITIES TO BIG DATA ANALYTICS, 2018, 10658