Non-convex low-rank representation combined with rank-one matrix sum for subspace clustering

Cited by: 0
Authors
Liu, Xiaofang [2 ]
Wang, Jun [1 ]
Cheng, Dansong [1 ]
Shi, Daming [3 ]
Zhang, Yongqiang [1 ]
Affiliations
[1] Harbin Inst Technol, Sch Comp Sci & Technol, 92 West Dazhi St, Harbin, Peoples R China
[2] Harbin Inst Technol, Sch Elect Engn & Automat, 92 West Dazhi St, Harbin, Peoples R China
[3] Shenzhen Univ, Coll Comp & Software, 3688 Nanhai Ave, Shenzhen, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Subspace clustering; Non-convex low-rank representation; Block coordinate descent; Rank-one matrix; DECISION-MAKING; ALGORITHM;
DOI
10.1007/s00500-020-04865-0
CLC number
TP18 [Theory of Artificial Intelligence];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
Exploring the multiple subspace structures of data, for example via low-rank representation, is effective in subspace clustering. Non-convex low-rank representation (NLRR) via matrix factorization is one of the state-of-the-art techniques for subspace clustering. However, NLRR cannot scale to problems with large n (the number of samples), as it requires either the inversion of an n x n matrix or the solution of an n x n linear system. To address this issue, we propose a novel approach, NLRR++, which reformulates NLRR as a sum of rank-one components and applies column-wise block coordinate descent to update each component iteratively. NLRR++ reduces the time complexity per iteration from O(n^3) to O(mnd) and the memory complexity from O(n^2) to O(mn), where m is the dimensionality and d is the target rank (usually d << m << n). Our experimental results on simulations and real datasets demonstrate the efficiency and effectiveness of NLRR++. We show that NLRR++ is not only much faster than NLRR, but also scalable to large datasets such as the ImageNet dataset with 120K samples.
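To make the rank-one-matrix-sum idea in the abstract concrete, the sketch below shows a generic column-wise block coordinate descent over rank-one blocks u_k v_k^T in NumPy. It is only an illustration of the mechanics, not the authors' NLRR++ objective or update rules: the ridge-regularized least-squares objective, the function name rank_one_bcd, and all parameter values are assumptions made for this example.

# Generic sketch of a rank-one-matrix-sum factorization updated by
# column-wise block coordinate descent (BCD). This is NOT the NLRR++
# algorithm itself; it only illustrates cycling over rank-one blocks
# u_k v_k^T with closed-form ridge updates, one block at a time.
import numpy as np

def rank_one_bcd(M, d, lam=1e-2, n_sweeps=20, seed=0):
    """Approximate M (m x n) by Z = sum_k u_k v_k^T, one block per step."""
    m, n = M.shape
    rng = np.random.default_rng(seed)
    U = 0.01 * rng.standard_normal((m, d))   # columns u_k
    V = 0.01 * rng.standard_normal((n, d))   # columns v_k
    R = M - U @ V.T                          # residual with all d terms removed
    for _ in range(n_sweeps):
        for k in range(d):
            # add the k-th term back, so R = M - sum_{j != k} u_j v_j^T
            R += np.outer(U[:, k], V[:, k])
            # exact ridge updates for the k-th block with the others fixed:
            #   u_k = R v_k / (v_k' v_k + lam),  v_k = R' u_k / (u_k' u_k + lam)
            U[:, k] = R @ V[:, k] / (V[:, k] @ V[:, k] + lam)
            V[:, k] = R.T @ U[:, k] / (U[:, k] @ U[:, k] + lam)
            # remove the refreshed term; each sweep costs O(m*n*d) time
            R -= np.outer(U[:, k], V[:, k])
    return U, V

# Toy usage: recover a rank-3 structure from a noisy 200 x 500 matrix.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    M = rng.standard_normal((200, 3)) @ rng.standard_normal((3, 500))
    M += 0.01 * rng.standard_normal(M.shape)
    U, V = rank_one_bcd(M, d=3)
    print("relative error:", np.linalg.norm(M - U @ V.T) / np.linalg.norm(M))

Because each block update touches only one residual-sized matrix and two length-m/length-n vectors, the per-sweep cost stays linear in the number of samples, which is the scaling property the abstract attributes to NLRR++.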
Pages: 15317-15326
Number of pages: 10