Differentiable Bi-Sparse Multi-View Co-Clustering

Cited by: 35
Authors
Du, Shide [1 ,2 ]
Liu, Zhanghui [1 ,2 ]
Chen, Zhaoliang [1 ,2 ]
Yang, Wenyuan [3 ]
Wang, Shiping [1 ,2 ]
Affiliations
[1] Fuzhou Univ, Coll Math & Comp Sci, Fuzhou 350116, Peoples R China
[2] Fuzhou Univ, Fujian Prov Key Lab Network Comp & Intelligent In, Fuzhou 350116, Peoples R China
[3] Minnan Normal Univ, Fujian Key Lab Granular Comp & Applicat, Zhangzhou 363000, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Collaboration; Deep learning; multi-view clustering; co-clustering; sparse representation; differentiable blocks;
DOI
10.1109/TSP.2021.3101979
Chinese Library Classification
TM [Electrical Engineering]; TN [Electronic & Communication Technology];
Subject Classification Codes
0808; 0809;
Abstract
Deep multi-view clustering utilizes neural networks to extract the latent complementary and consistent information among multi-view features, yielding a consistent representation that improves clustering performance. Although a multitude of deep multi-view clustering approaches have been proposed, most lack theoretical interpretability despite their good performance. In this paper, we propose an effective differentiable network with alternating iterative optimization for multi-view co-clustering, termed differentiable bi-sparse multi-view co-clustering (DBMC), and an extension named elevated DBMC (EDBMC). The proposed methods are transformed into equivalent deep networks based on the constructed objective loss functions, so they combine the strong interpretability of classical machine learning methods with the superior performance of deep networks. Moreover, DBMC and EDBMC learn a joint and consistent collaborative representation from multi-source features while guaranteeing sparsity in both the multi-view feature space and the single-view sample space, and they can be converted into deep differentiable network frameworks with block-wise iterative training. Correspondingly, we design two three-step iterative differentiable networks to solve the resulting optimization problems with theoretically guaranteed convergence. Extensive experiments on six multi-view benchmark datasets demonstrate that the proposed frameworks outperform other state-of-the-art multi-view clustering methods.
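The abstract's central idea, unrolling an iterative sparse-representation optimizer into differentiable blocks and fusing per-view codes into a consensus representation, can be illustrated with a generic sketch. This is a plain unrolled ISTA on a toy two-view problem, not the paper's actual DBMC objective; the dictionaries, shapes, and the simple averaging fusion are illustrative assumptions:

```python
import numpy as np

def soft_threshold(x, t):
    """Proximal operator of the l1 norm; induces sparsity in the codes."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ista_sparse_code(X, D, lam=0.1, n_steps=10):
    """Unrolled ISTA for min_Z 0.5*||X - D Z||_F^2 + lam*||Z||_1.

    Each of the n_steps iterations is a fixed, differentiable block
    (gradient step + soft-threshold), which is the sense in which such
    optimizers can be viewed as feed-forward networks.
    """
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the smooth part
    Z = np.zeros((D.shape[1], X.shape[1]))
    for _ in range(n_steps):
        grad = D.T @ (D @ Z - X)           # gradient of 0.5*||X - D Z||_F^2
        Z = soft_threshold(Z - grad / L, lam / L)
    return Z

# Toy two-view setup: one sparse code per view, then a naive average as a
# stand-in for the consensus representation used for clustering.
rng = np.random.default_rng(0)
views = [rng.standard_normal((8, 20)) for _ in range(2)]   # two views, 20 samples
dicts = [rng.standard_normal((8, 12)) for _ in range(2)]   # per-view dictionaries
codes = [ista_sparse_code(X, D) for X, D in zip(views, dicts)]
consensus = sum(codes) / len(codes)
```

Because every block is a fixed sequence of differentiable operations, the same loop could be trained end-to-end (e.g. with learnable thresholds or dictionaries), which is the general mechanism behind "differentiable blocks" in unrolling-based methods.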
Pages: 4623-4636
Page count: 14
Related Papers (50 total)
  • [1] Weighted multi-view co-clustering (WMVCC) for sparse data
    Hussain, Syed Fawad
    Khan, Khadija
    Jillani, Rashad
    APPLIED INTELLIGENCE, 2022, 52 (01) : 398 - 416
  • [2] Co-clustering of multi-view datasets
    Hussain, Syed Fawad
    Bashir, Shariq
    KNOWLEDGE AND INFORMATION SYSTEMS, 2016, 47 (03) : 545 - 570
  • [3] Multi-view Sparse Co-clustering via Proximal Alternating Linearized Minimization
    Sun, Jiangwen
    Lu, Jin
    Xu, Tingyang
    Bi, Jinbo
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 37, 2015, 37 : 757 - 766
  • [4] Multi-view co-clustering with multi-similarity
    Zhao, Ling
    Ma, Yunpeng
    Chen, Shanxiong
    Zhou, Jun
    APPLIED INTELLIGENCE, 2023, 53 (13) : 16961 - 16972
  • [5] Co-clustering of Multi-View Datasets: a Parallelizable Approach
    Bisson, Gilles
    Grimal, Clement
    12TH IEEE INTERNATIONAL CONFERENCE ON DATA MINING (ICDM 2012), 2012, : 828 - 833
  • [6] Co-clustering based classification of multi-view data
    Hussain, Syed Fawad
    Khan, Mohsin
    Siddiqi, Imran
    APPLIED INTELLIGENCE, 2022, 52 (13) : 14756 - 14772