Double Diffusion Maps and their Latent Harmonics for scientific computations in latent space

Cited by: 10
Authors
Evangelou, Nikolaos [1 ]
Dietrich, Felix [2 ]
Chiavazzo, Eliodoro [3 ]
Lehmberg, Daniel [2 ]
Meila, Marina [4 ]
Kevrekidis, Ioannis G. [1 ,5 ]
Affiliations
[1] Johns Hopkins Univ, Chem & Biomol Engn, Baltimore, MD 21218 USA
[2] Tech Univ Munich, Dept Informat, D-80333 Munich, Germany
[3] Polytech Univ Turin, Dept Energy, I-10129 Turin, Italy
[4] Univ Washington, Dept Stat, Seattle, WA 98195 USA
[5] 3400 North Charles St, Baltimore, MD 21218 USA
Keywords
Manifold learning; Dynamical systems; Scientific computing; Diffusion Maps; INERTIAL MANIFOLDS; MODEL-REDUCTION; EQUATION-FREE; BEHAVIOR; TIME;
DOI
10.1016/j.jcp.2023.112072
Chinese Library Classification
TP39 [Computer Applications];
Subject Classification Codes
081203; 0835;
Abstract
We introduce a data-driven approach to building reduced dynamical models through manifold learning; the reduced latent space is discovered using Diffusion Maps (a manifold learning technique) on time series data. A second round of Diffusion Maps on those latent coordinates allows the approximation of the reduced dynamical models. This second round enables mapping the latent space coordinates back to the full ambient space (known as lifting); it also enables the approximation of full-state functions of interest in terms of the reduced coordinates. In our work, we develop and test three different reduced numerical simulation methodologies, either through pre-tabulation in the latent space and integration on the fly, or by going back and forth between the ambient space and the latent space. The data-driven latent space simulation results, based on the three different approaches, are validated (a) by observing the full simulation in the latent space through the Nyström extension formula, or (b) by lifting the reduced trajectory back to the full ambient space via Latent Harmonics. Latent space modeling often involves additional regularization to favor certain properties of the space over others, and the mapping back to the ambient space is then constructed largely independently of these properties; here, we use the same data-driven approach to construct the latent space and then map back to the ambient space. (c) 2023 Published by Elsevier Inc.
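The pipeline above rests on two standard ingredients: Diffusion Maps to discover the latent coordinates, and the Nyström extension formula to restrict new ambient-space points to those coordinates. The following is a minimal NumPy sketch of both, not the paper's implementation; the Gaussian kernel scale `epsilon`, the alpha = 1 density normalization, and the function names are illustrative choices.

```python
import numpy as np

def diffusion_maps(X, epsilon, n_coords=2):
    """Minimal Diffusion Maps: Gaussian kernel, alpha = 1 density
    normalization, then eigendecomposition of the Markov matrix."""
    # Pairwise squared distances and Gaussian kernel
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-d2 / epsilon)
    q = K.sum(axis=1)                      # sampling-density estimate
    K_tilde = K / np.outer(q, q)           # alpha = 1 normalization
    d = K_tilde.sum(axis=1)
    # Symmetric conjugate S = D^{-1/2} K_tilde D^{-1/2} of the Markov
    # matrix P = D^{-1} K_tilde; same eigenvalues, stable eigh
    S = K_tilde / np.sqrt(np.outer(d, d))
    vals, vecs = np.linalg.eigh(S)
    idx = np.argsort(vals)[::-1]           # descending eigenvalues
    vals, vecs = vals[idx], vecs[:, idx]
    psi = vecs / np.sqrt(d)[:, None]       # right eigenvectors of P
    # Skip the trivial constant eigenvector (eigenvalue 1)
    return psi[:, 1:n_coords + 1], vals[1:n_coords + 1], q

def nystrom_extension(X, x_new, psi, vals, q, epsilon):
    """Restrict a new ambient point to the learned diffusion coordinates."""
    k = np.exp(-np.sum((X - x_new) ** 2, axis=1) / epsilon)
    k_tilde = k / (k.sum() * q)            # same alpha = 1 normalization
    p = k_tilde / k_tilde.sum()            # Markov transition row for x_new
    return (p @ psi) / vals                # psi_j(x_new) = (p . psi_j)/lambda_j
```

For a training point, the Nyström formula reproduces that point's diffusion coordinates exactly (the transition row recomputed for it coincides with the corresponding row of the Markov matrix), which makes a convenient sanity check. The paper's second round, double Diffusion Maps with Latent Harmonics, would additionally learn the inverse (lifting) map from latent coordinates back to the ambient space.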
Pages: 19