Maximizing bi-mutual information of features for self-supervised deep clustering

Cited: 0
Authors
Jiacheng Zhao
Junfen Chen
Xiangjie Meng
Junhai Zhai
Affiliations
[1] Hebei University, Key Laboratory of Machine Learning and Computational Intelligence of Hebei Province, College of Mathematics and Information Science
Keywords
Self-supervised clustering; Mutual information maximization; Auxiliary over-clustering; Feature representation
DOI: 10.1007/s43674-021-00012-w
Abstract
Self-supervised learning based on mutual information makes good use of classification models and the label information produced by clustering tasks to train network parameters, and then updates the downstream clustering assignments by maximizing the mutual information between label information. Such methods have attracted increasing attention and made good progress, but there is still considerable room for improvement compared with supervised methods, especially on challenging image datasets. To this end, a self-supervised deep clustering method that maximizes two mutual-information terms (bi-MIM-SSC) is proposed, in which a deep convolutional network is employed as the feature encoder. The first term maximizes mutual information between output-feature pairs, injecting more semantic meaning into the output features. The second term maximizes mutual information between an input image and the feature generated by the encoder, preserving as much useful information of the original image in the latent space as possible. Furthermore, pre-training is carried out to further enhance the representation ability of the encoder, and auxiliary over-clustering is added to the clustering network. The performance of the proposed bi-MIM-SSC method is compared with that of other clustering methods on the CIFAR10, CIFAR100 and STL10 datasets. Experimental results demonstrate that bi-MIM-SSC has better feature-representation ability and provides better clustering results.
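The first objective term described in the abstract (maximizing mutual information between output-feature pairs, i.e. between the soft cluster assignments of two views of the same image) can be sketched as follows. This is an illustrative NumPy reconstruction of that style of pairwise mutual-information objective, not the authors' implementation; the function name and the assumption that assignments arrive as an (n, k) matrix of softmax probabilities are ours:

```python
import numpy as np

def pairwise_mutual_information(p1, p2):
    """Empirical mutual information between paired soft cluster assignments.

    p1, p2: (n, k) arrays; row i holds the softmax cluster probabilities of
    two views (e.g. original and augmented) of the same image i.
    """
    n = p1.shape[0]
    joint = p1.T @ p2 / n            # (k, k) empirical joint distribution
    joint = (joint + joint.T) / 2    # symmetrize: view order is arbitrary
    pi = joint.sum(axis=1, keepdims=True)   # marginal over first view
    pj = joint.sum(axis=0, keepdims=True)   # marginal over second view
    eps = 1e-12                      # guard log(0) for empty cells
    # I(Z1; Z2) = sum_ij p(i,j) * log( p(i,j) / (p(i) p(j)) )
    return np.sum(joint * (np.log(joint + eps)
                           - np.log(pi + eps)
                           - np.log(pj + eps)))
```

Training would minimize the negative of this quantity; perfectly consistent, balanced one-hot assignments over k clusters attain the maximum log(k), while assignments independent of the input give mutual information near zero.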
Related papers (50 total)
  • [21] Deep Self-Supervised Attributed Graph Clustering for Social Network Analysis
    Hu Lu
    Haotian Hong
    Xia Geng
    Neural Processing Letters, 56
  • [22] Deep Self-Supervised Graph Attention Convolution Autoencoder for Networks Clustering
    Chen, Chao
    Lu, Hu
    Hong, Haotian
    Wang, Hai
    Wan, Shaohua
    IEEE TRANSACTIONS ON CONSUMER ELECTRONICS, 2023, 69 (04) : 974 - 983
  • [23] Self-supervised autoencoders for clustering and classification
    Paraskevi Nousi
    Anastasios Tefas
    Evolving Systems, 2020, 11 : 453 - 466
  • [24] Self-Supervised Embedding for Subspace Clustering
    Zhu, Wenjie
    Peng, Bo
    Chen, Chunchun
    PROCEEDINGS OF THE 30TH ACM INTERNATIONAL CONFERENCE ON INFORMATION & KNOWLEDGE MANAGEMENT, CIKM 2021, 2021, : 3687 - 3691
  • [25] Self-supervised autoencoders for clustering and classification
    Nousi, Paraskevi
    Tefas, Anastasios
    EVOLVING SYSTEMS, 2020, 11 (03) : 453 - 466
  • [26] Clustering by Maximizing Mutual Information Across Views
    Kien Do
    Truyen Tran
    Venkatesh, Svetha
    2021 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2021), 2021, : 9908 - 9918
  • [27] Self-Supervised Discriminative Feature Learning for Deep Multi-View Clustering
    Xu, Jie
    Ren, Yazhou
    Tang, Huayi
    Yang, Zhimeng
    Pan, Lili
    Yang, Yang
    Pu, Xiaorong
    Yu, Philip S.
    He, Lifang
    IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2023, 35 (07) : 7470 - 7482
  • [28] SensorSCAN: Self-supervised learning and deep clustering for fault diagnosis in chemical processes
    Golyadkin, Maksim
    Pozdnyakov, Vitaliy
    Zhukov, Leonid
    Makarov, Ilya
    ARTIFICIAL INTELLIGENCE, 2023, 324
  • [29] Generic network for domain adaptation based on self-supervised learning and deep clustering
    Baffour, Adu Asare
    Qin, Zhen
    Geng, Ji
    Ding, Yi
    Deng, Fuhu
    Qin, Zhiguang
    NEUROCOMPUTING, 2022, 476 : 126 - 136
  • [30] Short Text Clustering with a Deep Multi-embedded Self-supervised Model
    Zhang, Kai
    Lian, Zheng
    Li, Jiangmeng
    Li, Haichang
    Hu, Xiaohui
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING, ICANN 2021, PT V, 2021, 12895 : 150 - 161