Robust and Efficient Subspace Segmentation via Least Squares Regression

Cited by: 482
Authors:
Lu, Can-Yi [1 ,2 ]
Min, Hai [1 ]
Zhao, Zhong-Qiu [3 ]
Zhu, Lin [1 ]
Huang, De-Shuang [4 ]
Yan, Shuicheng [2 ]
Affiliations:
[1] Univ Sci & Technol China, Dept Automat, Hefei 230026, Peoples R China
[2] Natl Univ Singapore, Dept Elect & Comp Engn, Singapore, Singapore
[3] Hefei Univ Technol, Sch Comp & Informat, Hefei, Peoples R China
[4] Tongji Univ, Sch Elect & Informat Engn, Shanghai, Peoples R China
Funding: National Research Foundation of Singapore
Keywords: FACE RECOGNITION
DOI: 10.1007/978-3-642-33786-4_26
CLC classification: TP18 [Theory of Artificial Intelligence]
Subject classification codes: 081104; 0812; 0835; 1405
Abstract
This paper studies the subspace segmentation problem, which aims to segment data drawn from a union of multiple linear subspaces. Recent work based on sparse representation, low-rank representation, and their extensions has attracted much attention. If the subspaces from which the data are drawn are independent or orthogonal, these methods can obtain a block diagonal affinity matrix, which usually leads to a correct segmentation; the main difference among them lies in their objective functions. We theoretically show that if the objective function satisfies certain conditions and the data are sufficiently sampled from independent subspaces, the obtained affinity matrix is always block diagonal. Furthermore, the data sampling can be insufficient if the subspaces are orthogonal. Several existing methods are special cases of this framework. We then present the Least Squares Regression (LSR) method for subspace segmentation. LSR takes advantage of data correlation, which is common in real data, and encourages a grouping effect that tends to group highly correlated data together. Experimental results on the Hopkins 155 database and the Extended Yale Database B show that our method significantly outperforms state-of-the-art methods. Beyond segmentation accuracy, all experiments demonstrate that LSR is much more efficient.
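The appeal of LSR is that the regularized least-squares objective admits a closed-form solution. A minimal sketch, assuming the standard regularized formulation min_Z ||X - XZ||_F^2 + λ||Z||_F^2 with samples as columns of X (whose minimizer is Z* = (XᵀX + λI)⁻¹XᵀX), illustrates the block-diagonal affinity on toy data from two independent subspaces; the regularization weight and toy dimensions below are illustrative choices, not the paper's settings:

```python
import numpy as np

def lsr_affinity(X, lam=0.01):
    """LSR coefficients and affinity matrix.

    Solves min_Z ||X - X Z||_F^2 + lam * ||Z||_F^2 in closed form:
    Z = (X^T X + lam I)^{-1} X^T X, with one sample per column of X.
    """
    n = X.shape[1]
    G = X.T @ X
    Z = np.linalg.solve(G + lam * np.eye(n), G)
    # Symmetrized affinity, commonly fed to spectral clustering.
    W = (np.abs(Z) + np.abs(Z.T)) / 2
    return Z, W

# Toy data: 40 points from each of two independent 3-D subspaces in R^30.
rng = np.random.default_rng(0)
U1 = rng.standard_normal((30, 3))
U2 = rng.standard_normal((30, 3))
X = np.hstack([U1 @ rng.standard_normal((3, 40)),
               U2 @ rng.standard_normal((3, 40))])

Z, W = lsr_affinity(X, lam=0.01)
within = W[:40, :40].mean()   # affinities inside subspace 1
cross = W[:40, 40:].mean()    # affinities across subspaces
print(within > cross)         # near-block-diagonal affinity
```

The final spectral clustering step (e.g. normalized cuts on W) is omitted here; the point is only that cross-subspace affinities are driven toward zero when the subspaces are independent.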
Pages: 347-360 (14 pages)