Stable and Fast Deep Mutual Information Maximization Based on Wasserstein Distance

Citations: 0
Authors
He, Xing [1 ,2 ]
Peng, Changgen [3 ]
Wang, Lin [2 ]
Tan, Weijie [3 ]
Wang, Zifan [4 ]
Affiliations
[1] Guizhou Univ, Coll Comp Sci & Technol, State Key Lab Publ Big Data, Guiyang 550025, Peoples R China
[2] Guizhou Minzu Univ, Guizhou Key Lab Pattern Recognit & Intelligent Sys, Guiyang 550025, Peoples R China
[3] Guizhou Univ, Guizhou Big Data Acad, Guiyang 550025, Peoples R China
[4] Inst Guizhou Aerosp Measuring & Testing Technol, Guiyang 550009, Peoples R China
Keywords
machine learning; deep learning; unsupervised learning; encoder network; mutual information estimation; nonlinear dimensionality reduction; eigenmaps
DOI
10.3390/e25121607
Chinese Library Classification
O4 [Physics]
Subject Classification Code
0702
Abstract
Deep learning is one of the most exciting and promising techniques in the field of artificial intelligence (AI), driving AI applications to become more intelligent and comprehensive. However, existing deep learning techniques usually require large amounts of expensive labeled data, which limits their application and development; it is therefore imperative to study unsupervised machine learning. The Deep InfoMax (DIM) method, which learns deep representations by mutual information estimation and maximization, has achieved unprecedented results in unsupervised learning. However, to constrain the encoder to learn more normalized feature representations, DIM uses adversarial learning to push the encoder output toward a prior distribution. Training such adversarial models is difficult to make converge: the cross-entropy loss contains a logarithmic term, so the gradients of the model parameters are susceptible to the "gradient explosion" and "gradient vanishing" phenomena, which makes training of the DIM method extremely unstable. To address this, we propose a Wasserstein distance-based DIM method, called WDIM, to solve the stability problem of model training. We then verify the training stability of WDIM and its classification ability under unsupervised learning on the CIFAR10, CIFAR100, and STL10 datasets. The experiments show that the proposed WDIM method is more stable to parameter updates and converges faster, while achieving almost the same accuracy as DIM on unsupervised classification tasks. Finally, we discuss directions for future research on WDIM, aiming to provide research ideas for solving image classification with unsupervised learning.
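The stability argument in the abstract can be illustrated with a toy comparison (a minimal NumPy sketch, not the authors' implementation; the function names and the specific score values are illustrative assumptions): the cross-entropy discriminator loss used for adversarial prior matching contains logarithmic terms whose gradients blow up as the discriminator output saturates, whereas a Wasserstein-style critic loss is linear in the scores, so its gradient stays bounded.

```python
import numpy as np

def cross_entropy_disc_loss(p_real, p_fake):
    """GAN-style discriminator loss; p_* are sigmoid outputs in (0, 1).
    The log terms diverge as outputs approach 0 or 1."""
    return -(np.log(p_real) + np.log(1.0 - p_fake)).mean()

def wasserstein_critic_loss(s_real, s_fake):
    """WGAN-style critic loss; s_* are unconstrained real-valued scores.
    Linear in the scores, so the gradient magnitude is constant."""
    return (s_fake - s_real).mean()

# Near-saturated discriminator outputs (fake samples fooling it):
p_real = np.array([0.999999])
p_fake = np.array([0.999999])
# Unconstrained critic scores for the same situation:
s_real = np.array([5.0])
s_fake = np.array([4.999])

ce = cross_entropy_disc_loss(p_real, p_fake)   # large: -log(1e-6) term
w = wasserstein_critic_loss(s_real, s_fake)    # small and well-behaved

# Gradients with respect to the fake-sample score:
grad_ce = 1.0 / (1.0 - p_fake)   # d/dp_fake of -log(1 - p_fake): explodes
grad_w = np.ones_like(s_fake)    # d/ds_fake of (s_fake - s_real): bounded
```

In this regime the cross-entropy loss and its gradient grow without bound (here the gradient reaches about 10^6), while the Wasserstein critic's gradient is constant, which is the intuition behind replacing the adversarial prior-matching loss in WDIM.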
Pages: 16