Learning joint latent representations based on information maximization

Cited by: 24
Authors
Ye, Fei [1 ]
Bors, Adrian G. [1 ]
Affiliations
[1] Univ York, Dept Comp Sci, York YO10 5GH, N Yorkshire, England
Keywords
Disentangled learning; Variational Autoencoders (VAE); Generative Adversarial Nets (GAN); Representation learning; Mutual Information;
DOI
10.1016/j.ins.2021.03.007
CLC number
TP [Automation technology; computer technology]
Discipline code
0812
Abstract
Learning disentangled and interpretable representations is an important aspect of information understanding. In this paper, we propose a novel deep learning model representing both discrete and continuous latent variable spaces, which can be used in either supervised or unsupervised learning. The proposed model is trained using an optimization function employing the mutual information maximization criterion. For the unsupervised learning setting, we define a lower bound on the mutual information between the joint distribution of the latent variables corresponding to the real data and those generated by the model. Maximizing this lower bound during training induces the learning of disentangled and interpretable data representations. Such representations can be used for attribute manipulation and image editing tasks. (c) 2021 Elsevier Inc. All rights reserved.
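The variational mutual-information lower bound mentioned in the abstract follows the general pattern E[log q(c|x)] + H(c) <= I(c; x), where q is an auxiliary recognition distribution: the bound is tight exactly when q equals the true posterior p(c|x). A minimal sketch on a toy discrete joint distribution (the numbers below are invented for illustration and do not come from the paper):

```python
import numpy as np

# Toy joint p(c, x) over a discrete latent c (rows) and observation x (cols).
p_joint = np.array([[0.30, 0.10],
                    [0.05, 0.55]])
p_c = p_joint.sum(axis=1)  # marginal over the latent
p_x = p_joint.sum(axis=0)  # marginal over the observation

# True mutual information I(c; x) = sum p(c,x) log [p(c,x) / (p(c) p(x))].
I_true = np.sum(p_joint * np.log(p_joint / np.outer(p_c, p_x)))

# Entropy of the latent prior, H(c).
H_c = -np.sum(p_c * np.log(p_c))

def lower_bound(q_c_given_x):
    """Variational bound: E_{p(c,x)}[log q(c|x)] + H(c) <= I(c; x)."""
    return np.sum(p_joint * np.log(q_c_given_x)) + H_c

# With the exact posterior p(c|x), the bound is tight (equals I_true).
posterior = p_joint / p_x  # column j holds p(c | x = j)
assert abs(lower_bound(posterior) - I_true) < 1e-12

# Any other recognition distribution yields a smaller value.
q_uniform = np.full((2, 2), 0.5)
assert lower_bound(q_uniform) <= I_true
```

In the models the paper builds on, q(c|x) is parameterized by a neural network and the bound is maximized by gradient ascent, avoiding the intractable true posterior.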
Pages: 216 - 236
Page count: 21
Related papers
50 records in total
  • [1] INFOVAEGAN : LEARNING JOINT INTERPRETABLE REPRESENTATIONS BY INFORMATION MAXIMIZATION AND MAXIMUM LIKELIHOOD
    Ye, Fei
    Bors, Adrian G.
    2021 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING (ICIP), 2021, : 749 - 753
  • [2] Learning Representations by Graphical Mutual Information Estimation and Maximization
    Peng, Zhen
    Luo, Minnan
    Huang, Wenbing
    Li, Jundong
    Zheng, Qinghua
    Sun, Fuchun
    Huang, Junzhou
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2023, 45 (01) : 722 - 737
  • [3] Uncertainty Autoencoders: Learning Compressed Representations via Variational Information Maximization
    Grover, Aditya
    Ermon, Stefano
    22ND INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 89, 2019, 89
  • [4] Variational Graph Autoencoder with Mutual Information Maximization for Graph Representations Learning
    Li, Dongjie
    Li, Dong
    Lian, Guang
    INTERNATIONAL JOURNAL OF PATTERN RECOGNITION AND ARTIFICIAL INTELLIGENCE, 2022, 36 (09)
  • [5] Joint Learning of Multiple Latent Domains and Deep Representations for Domain Adaptation
    Wu, Xinxiao
    Chen, Jin
    Yu, Feiwu
    Yao, Mingyu
    Luo, Jiebo
    IEEE TRANSACTIONS ON CYBERNETICS, 2021, 51 (05) : 2676 - 2687
  • [6] Joint maximization of accuracy and information for learning the structure of a Bayesian network classifier
    Dan Halbersberg
    Maydan Wienreb
    Boaz Lerner
    Machine Learning, 2020, 109 : 1039 - 1099
  • [7] Joint maximization of accuracy and information for learning the structure of a Bayesian network classifier
    Halbersberg, Dan
    Wienreb, Maydan
    Lerner, Boaz
    MACHINE LEARNING, 2020, 109 (05) : 1039 - 1099
  • [8] Learning graph representations for influence maximization
    Panagopoulos, George
    Tziortziotis, Nikolaos
    Vazirgiannis, Michalis
    Pang, Jun
    Malliaros, Fragkiskos D.
    SOCIAL NETWORK ANALYSIS AND MINING, 2024, 14 (01)
  • [9] Feature selection based on fuzzy joint mutual information maximization
    Salem, Omar A. M.
    Liu, Feng
    Sherif, Ahmed Sobhy
    Zhang, Wen
    Chen, Xi
    MATHEMATICAL BIOSCIENCES AND ENGINEERING, 2020, 18 (01) : 305 - 327
  • [10] An information maximization approach to overcomplete and recurrent representations
    Shriki, O
    Sompolinsky, H
    Lee, DD
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 13, 2001, 13 : 612 - 618