Towards Very Deep Representation Learning for Subspace Clustering

Cited: 0
Authors
Li, Yanming [1 ]
Wang, Shiye [1 ]
Li, Changsheng [1 ]
Yuan, Ye [1 ]
Wang, Guoren [1 ]
Affiliation
[1] Beijing Inst Technol, Sch Comp Sci & Technol, Beijing 100081, Peoples R China
Keywords
Representation learning; self-supervised learning; subspace clustering; robust
DOI
10.1109/TKDE.2024.3362984
CLC Number
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Deep subspace clustering based on the self-expressive layer has attracted increasing attention in recent years. Due to the self-expressive layer, these methods need to load the whole dataset into one batch to learn the self-expressive coefficients. Such a learning strategy puts a great burden on memory, which severely prevents the use of deeper network architectures (e.g., ResNet) and becomes a bottleneck when applying these methods to large-scale data. In this paper, we propose a new deep subspace clustering framework to address the above challenges. In contrast to previous approaches that take the weights of a fully connected layer as the self-expressive coefficients, we attempt to obtain the self-expressive coefficients by learning an energy-based network in a mini-batch training manner. By this means, it is no longer necessary to load all data into one batch for learning, thus avoiding the above issue. Considering the powerful representation ability of recently popular self-supervised learning, we leverage self-supervised representation learning to learn the dictionary for representing data. Finally, we propose a joint framework to learn both the self-expressive coefficients and the dictionary simultaneously. Extensive experiments on three publicly available datasets demonstrate the effectiveness of our method.
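The abstract contrasts loading all data into one self-expressive layer against computing coefficients per mini-batch with respect to a dictionary. The following is a minimal numerical sketch of that idea only, not the paper's method: it replaces the paper's learned energy-based network with a hypothetical ridge-regularized closed form, and uses a fixed synthetic dictionary in place of the self-supervised one, to show that per-batch coefficient estimation needs only the batch and the dictionary in memory.

```python
import numpy as np

def batch_self_expressive_coeffs(X_batch, D, lam=0.1):
    """Solve min_C ||X_batch - D C||_F^2 + lam ||C||_F^2 for one mini-batch.

    This closed-form ridge solution is an illustrative stand-in for the
    paper's energy-based network; only the batch and the dictionary D are
    needed, never the full dataset.
    """
    k = D.shape[1]
    G = D.T @ D + lam * np.eye(k)             # (k, k) regularized Gram matrix
    return np.linalg.solve(G, D.T @ X_batch)  # (k, batch) coefficients

rng = np.random.default_rng(0)
# Toy data: two random 2-D subspaces embedded in a 10-D ambient space.
U1 = np.linalg.qr(rng.standard_normal((10, 2)))[0]
U2 = np.linalg.qr(rng.standard_normal((10, 2)))[0]
# Dictionary of 40 atoms, 20 drawn from each subspace.
D = np.hstack([U1 @ rng.standard_normal((2, 20)),
               U2 @ rng.standard_normal((2, 20))])

x = U1 @ rng.standard_normal((2, 5))  # mini-batch of 5 samples from subspace 1
C = batch_self_expressive_coeffs(x, D)

# The batch is reconstructed almost exactly from dictionary atoms.
rel_err = np.linalg.norm(x - D @ C) / np.linalg.norm(x)
print(C.shape, rel_err < 0.1)
```

Because each mini-batch is solved independently, batches can be streamed through this step, which is the memory property the abstract argues for; the actual paper trains a network to produce such coefficients rather than using this closed form.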
Pages: 3568-3579 (12 pages)