Accuracy-Preserving and Scalable Column-Based Low-Rank Matrix Approximation

Times Cited: 0
Authors
Wu, Jiangang [1 ]
Liao, Shizhong [1 ]
Affiliations
[1] Tianjin Univ, Sch Comp Sci & Technol, Tianjin 300072, Peoples R China
Keywords
Low-rank matrix approximation; Divide-and-conquer; Scalability; Machine learning; FACE RECOGNITION; ALGORITHMS;
DOI
10.1007/978-3-319-25159-2_22
CLC Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Column-based low-rank matrix approximation is a useful method for analyzing and interpreting data in machine learning and data mining. However, existing methods face accuracy and scalability problems when dealing with large-scale data. In this paper, we propose a new parallel framework for column-based low-rank matrix approximation based on a divide-and-conquer strategy. It consists of three stages: (1) dividing the original matrix into several small submatrices; (2) performing column-based low-rank matrix approximation to select columns on each submatrix in parallel; (3) combining these columns into the final result. We prove that the new parallel framework has a (1+ε) relative-error upper bound and show that it is more scalable than existing work. The results of comparison experiments and an application to kernel methods demonstrate the effectiveness and efficiency of our method on both synthetic and real-world datasets.
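The three-stage framework described in the abstract lends itself to a compact divide-and-conquer implementation. Below is a minimal NumPy sketch of that structure only, assuming squared-norm sampling as a stand-in for the paper's per-submatrix column-selection subroutine and a pseudoinverse projection for the final combination; the names select_columns, divide_and_conquer_column_approx, num_blocks, and k_per_block are illustrative and not taken from the paper.

import numpy as np

def select_columns(block, k, rng):
    # Stand-in column-selection subroutine (assumption): sample k distinct
    # columns with probability proportional to their squared Euclidean norms.
    probs = (block ** 2).sum(axis=0)
    probs = probs / probs.sum()
    return rng.choice(block.shape[1], size=k, replace=False, p=probs)

def divide_and_conquer_column_approx(A, num_blocks, k_per_block, seed=None):
    rng = np.random.default_rng(seed)
    # Stage 1: divide the columns of A into num_blocks submatrices.
    blocks = np.array_split(np.arange(A.shape[1]), num_blocks)
    # Stage 2: select columns within each submatrix; this loop is
    # embarrassingly parallel and could be distributed across workers.
    chosen = []
    for idx in blocks:
        local = select_columns(A[:, idx], k_per_block, rng)
        chosen.extend(idx[local])
    # Stage 3: combine the selected columns C and project A onto span(C).
    C = A[:, chosen]
    A_approx = C @ (np.linalg.pinv(C) @ A)
    return C, A_approx

# Example: approximate a 1000 x 800 matrix using 4 blocks, 10 columns each,
# and report the relative Frobenius-norm approximation error.
A = np.random.default_rng(0).standard_normal((1000, 800))
C, A_hat = divide_and_conquer_column_approx(A, num_blocks=4, k_per_block=10, seed=1)
print(C.shape, np.linalg.norm(A - A_hat) / np.linalg.norm(A))

In a genuinely parallel run, only the small selected-column matrices from stage 2 need to be gathered before stage 3, which is consistent with the scalability argument sketched in the abstract.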
Pages: 236-247
Number of pages: 12