Metrics for Deep Generative Models

Cited by: 0
Authors
Chen, Nutan [1 ]
Klushyn, Alexej [1 ]
Kurle, Richard [1 ]
Jiang, Xueyan [1 ]
Bayer, Justin [1 ]
van der Smagt, Patrick [1 ]
Affiliation
[1] Volkswagen Group, Data Lab, AI Research, Munich, Germany
Keywords: none listed
DOI: not available
CLC classification: TP18 [Artificial Intelligence Theory]
Discipline codes: 081104; 0812; 0835; 1405
Abstract
Neural samplers such as variational autoencoders (VAEs) or generative adversarial networks (GANs) approximate distributions by transforming samples from a simple random source (the latent space) into samples from a more complex distribution represented by a dataset. While the manifold hypothesis implies that a dataset contains large regions of low density, the training criteria of VAEs and GANs make the latent space densely covered. Consequently, points that are separated by low-density regions in observation space are pushed together in latent space, making stationary distances poor proxies for similarity. We transfer ideas from Riemannian geometry to this setting, letting the distance between two points be the length of the shortest path on the Riemannian manifold induced by the transformation. The method yields a principled distance measure, provides a tool for visual inspection of deep generative models, and offers an alternative to linear interpolation in latent space. In addition, it can be applied to robot movement generalization using previously learned skills. The method is evaluated on a synthetic dataset with known ground truth, on a simulated robot arm dataset, on human motion capture data, and on a generative model of handwritten digits.
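The geodesic idea in the abstract can be illustrated with a small, self-contained sketch. The code below is not the authors' implementation: the toy decoder, its random weights, the finite-difference Jacobian, and all variable names are illustrative assumptions. It pulls the Euclidean metric of observation space back through a decoder g, giving the latent metric G(z) = J(z)^T J(z), and compares the Euclidean distance between two latent codes with the Riemannian length of the straight-line path between them.

```python
# Minimal sketch (assumed toy setup, not the paper's code): pull back the
# observation-space metric through a decoder and measure latent curve lengths.
import numpy as np

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(64, 2)), rng.normal(size=64)   # toy decoder weights
W2, b2 = rng.normal(size=(10, 64)), rng.normal(size=10)

def decode(z):
    """Toy decoder g: R^2 -> R^10, standing in for a trained VAE/GAN generator."""
    h = np.tanh(W1 @ z + b1)
    return W2 @ h + b2

def jacobian(z, eps=1e-5):
    """Finite-difference Jacobian J(z) of the decoder at a latent point z."""
    J = np.zeros((decode(z).size, z.size))
    for i in range(z.size):
        dz = np.zeros_like(z)
        dz[i] = eps
        J[:, i] = (decode(z + dz) - decode(z - dz)) / (2 * eps)
    return J

def curve_length(zs):
    """Riemannian length of a discretized latent curve zs[0..T] under G(z) = J^T J."""
    length = 0.0
    for a, b in zip(zs[:-1], zs[1:]):
        mid, dz = (a + b) / 2, b - a
        J = jacobian(mid)                      # pullback metric at the midpoint
        length += np.sqrt(dz @ (J.T @ J) @ dz)
    return length

# Compare the Euclidean distance between two latent codes with the Riemannian
# length of the straight-line interpolation between them.
z0, z1 = np.array([-2.0, 0.0]), np.array([2.0, 0.5])
line = np.linspace(z0, z1, 16)
print("Euclidean distance:", np.linalg.norm(z1 - z0))
print("Riemannian length :", curve_length(line))
# A geodesic distance as described in the abstract would additionally deform
# the intermediate points to minimize this length (e.g. by gradient descent on
# the discrete curve energy), so that paths avoid low-density regions.
```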
Pages: 10
Related Papers (50 in total)
  • [1] Regenwetter, Lyle; Srivastava, Akash; Gutfreund, Dan; Ahmed, Faez. Beyond Statistical Similarity: Rethinking Metrics for Deep Generative Models in Engineering Design. Computer-Aided Design, 2023, 165.
  • [2] Turinici, Gabriel. Diversity in Deep Generative Models and Generative AI. Machine Learning, Optimization, and Data Science (LOD 2023), Part II, 2024, 14506: 84-93.
  • [3] Betzalel, Eyal; Penso, Coby; Fetaya, Ethan. Evaluation Metrics for Generative Models: An Empirical Study. Machine Learning and Knowledge Extraction, 2024, 6(3): 1531-1544.
  • [4] Salakhutdinov, Ruslan. Learning Deep Generative Models. Annual Review of Statistics and Its Application, 2015, 2: 361-385.
  • [5] Lenz, Stefan; Hess, Moritz; Binder, Harald. Deep generative models in DataSHIELD. BMC Medical Research Methodology, 2021, 21(1).
  • [6] Partaourides, Harris; Chatzis, Sotirios P. Asymmetric deep generative models. Neurocomputing, 2017, 241: 90-96.
  • [7] Maaloe, Lars; Sonderby, Casper Kaae; Sonderby, Soren Kaae; Winther, Ole. Auxiliary Deep Generative Models. International Conference on Machine Learning, 2016, Vol. 48.
  • [8] Oussidi, Achraf; Elhassouny, Azeddine. Deep Generative Models: Survey. 2018 International Conference on Intelligent Systems and Computer Vision (ISCV2018), 2018.
  • [9] Xu, Jungang; Li, Hui; Zhou, Shilong. An Overview of Deep Generative Models. IETE Technical Review, 2015, 32(2): 131-139.