Double Diffusion Maps and their Latent Harmonics for scientific computations in latent space

Cited by: 10
Authors
Evangelou, Nikolaos [1 ]
Dietrich, Felix [2 ]
Chiavazzo, Eliodoro [3 ]
Lehmberg, Daniel [2 ]
Meila, Marina [4 ]
Kevrekidis, Ioannis G. [1 ,5 ]
Affiliations
[1] Johns Hopkins Univ, Chem & Biomol Engn, Baltimore, MD 21218 USA
[2] Tech Univ Munich, Dept Informat, D-80333 Munich, Germany
[3] Polytech Univ Turin, Dept Energy, I-10129 Turin, Italy
[4] Univ Washington, Dept Stat, Seattle, WA 98195 USA
[5] 3400 North Charles St, Baltimore, MD 21218 USA
Keywords
Manifold learning; Dynamical systems; Scientific computing; Diffusion Maps; INERTIAL MANIFOLDS; MODEL-REDUCTION; EQUATION-FREE; BEHAVIOR; TIME;
DOI
10.1016/j.jcp.2023.112072
Chinese Library Classification
TP39 [Applications of Computers];
Subject Classification Codes
081203; 0835;
Abstract
We introduce a data-driven approach to building reduced dynamical models through manifold learning; the reduced latent space is discovered using Diffusion Maps (a manifold learning technique) on time series data. A second round of Diffusion Maps on those latent coordinates allows the approximation of the reduced dynamical models. This second round enables mapping the latent space coordinates back to the full ambient space (what is called lifting); it also enables the approximation of full state functions of interest in terms of the reduced coordinates. In our work, we develop and test three different reduced numerical simulation methodologies, either through pre-tabulation in the latent space and integration on the fly or by going back and forth between the ambient space and the latent space. The data-driven latent space simulation results, based on the three different approaches, are validated through (a) the latent space observation of the full simulation through the Nyström extension formula, or through (b) lifting the reduced trajectory back to the full ambient space, via Latent Harmonics. Latent space modeling often involves additional regularization to favor certain properties of the space over others, and the mapping back to the ambient space is then constructed mostly independently from these properties; here, we use the same data-driven approach to construct the latent space and then map back to the ambient space. © 2023 Published by Elsevier Inc.
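The workflow sketched in the abstract (a first Diffusion Maps pass to obtain latent coordinates, a second pass on those coordinates whose eigenvectors serve as "latent harmonics" for lifting, and a Nyström-type extension for evaluating them at new points) can be illustrated with a short numerical example. The code below is a minimal NumPy sketch under simplifying assumptions (dense Gaussian kernels, a plain least-squares lifting, untuned kernel scales); the names diffusion_maps, nystrom, lift, snapshots.npy and all parameter values are hypothetical illustrations, not the authors' implementation or software.
```python
# Minimal sketch of the "double diffusion maps" workflow described in the abstract.
# All names and parameters are illustrative assumptions, not the paper's code.
import numpy as np


def diffusion_maps(X, epsilon, n_coords):
    """Basic diffusion-maps embedding of the rows of X (one sample per row)."""
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)  # pairwise squared distances
    K = np.exp(-d2 / epsilon)                                   # Gaussian kernel
    q = K.sum(axis=1)
    K = K / np.outer(q, q)                                      # density normalization (alpha = 1)
    M = K / K.sum(axis=1, keepdims=True)                        # Markov (row-stochastic) normalization
    evals, evecs = np.linalg.eig(M)
    order = np.argsort(-evals.real)
    evals, evecs = evals.real[order], evecs.real[:, order]
    # drop the trivial constant eigenvector, keep the leading nontrivial ones
    return evals[1:n_coords + 1], evecs[:, 1:n_coords + 1]


def nystrom(Y_new, Y_train, evals, evecs, epsilon):
    """Approximate Nyström extension of the eigenvectors to out-of-sample points
    (the density normalization used in training is omitted here for brevity)."""
    d2 = np.sum((Y_new[:, None, :] - Y_train[None, :, :]) ** 2, axis=-1)
    k = np.exp(-d2 / epsilon)
    k = k / k.sum(axis=1, keepdims=True)
    return (k @ evecs) / evals


# First round: latent coordinates from ambient time-series snapshots.
X = np.load("snapshots.npy")                    # hypothetical (n_samples, n_ambient) data
lam, Phi = diffusion_maps(X, epsilon=1.0, n_coords=2)

# Second round, on the latent coordinates: the eigenvectors ("latent harmonics")
# give a function basis on the latent space.
mu, Psi = diffusion_maps(Phi, epsilon=0.1, n_coords=50)

# Lifting: regress the ambient state on the latent harmonics ...
C, *_ = np.linalg.lstsq(Psi, X, rcond=None)

# ... and evaluate at new latent points (e.g. along a reduced trajectory)
# via the Nyström extension, mapping them back to the ambient space.
def lift(phi_new):
    psi_new = nystrom(phi_new, Phi, mu, Psi, epsilon=0.1)
    return psi_new @ C
```
In this sketch, restriction of new ambient states to the latent space would use the same Nyström construction with the first-round kernel, while lifting reverses the direction through the latent harmonics, mirroring the "going back and forth" simulation strategies mentioned in the abstract.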
Pages: 19
Related Papers
50 items in total
  • [31] Set Prediction in the Latent Space
    Preechakul, Konpat
    Piansaddhayanon, Chawan
    Naowarat, Burin
    Khandhawit, Tirasan
    Sriswasdi, Sira
    Chuangsuwanich, Ekapol
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [32] Evolutionary Planning in Latent Space
    Olesen, Thor V. A. N.
    Nguyen, Dennis T. T.
    Palm, Rasmus B.
    Risi, Sebastian
    APPLICATIONS OF EVOLUTIONARY COMPUTATION, EVOAPPLICATIONS 2021, 2021, 12694 : 522 - 536
  • [33] The latent structure of global scientific development
    Miao, Lili
    Murray, Dakota
    Jung, Woo-Sung
    Lariviere, Vincent
    Sugimoto, Cassidy R.
    Ahn, Yong-Yeol
    NATURE HUMAN BEHAVIOUR, 2022, 6 (09) : 1206 - 1217
  • [34] The latent structure of global scientific development
    Lili Miao
    Dakota Murray
    Woo-Sung Jung
    Vincent Larivière
    Cassidy R. Sugimoto
    Yong-Yeol Ahn
    Nature Human Behaviour, 2022, 6 : 1206 - 1217
  • [35] Goal Recognition in Latent Space
    Amado, Leonardo
    Pereira, Ramon Fraga
    Aires, Joao
    Magnaguagno, Mauricio
    Granada, Roger
    Meneguzzi, Felipe
    2018 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2018,
  • [36] Latent Space Bayesian Optimization With Latent Data Augmentation for Enhanced Exploration
    Boyar, Onur
    Takeuchi, Ichiro
    NEURAL COMPUTATION, 2024, 36 (11) : 2446 - 2478
  • [37] Associating Latent Representations With Cognitive Maps via Hyperspherical Space for Neural Population Spikes
    Huang, Yicong
    Yu, Zhu Liang
    IEEE TRANSACTIONS ON NEURAL SYSTEMS AND REHABILITATION ENGINEERING, 2022, 30 : 2886 - 2895
  • [38] On the Semantic Latent Space of Diffusion-Based Text-to-Speech Models
    Varshavsky-Hassid, Miri
    Hirsch, Roy
    Cohen, Regev
    Golany, Tomer
    Freedman, Daniel
    Rivlin, Ehud
    PROCEEDINGS OF THE 62ND ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, VOL 2: SHORT PAPERS, 2024, : 246 - 255
  • [39] Diffusion Model in Normal Gathering Latent Space for Time Series Anomaly Detection
    Han, Jiashu
    Feng, Shanshan
    Zhou, Min
    Zhang, Xinyu
    Ong, Yew Soon
    Li, Xutao
    MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES: RESEARCH TRACK, PT III, ECML PKDD 2024, 2024, 14943 : 284 - 300
  • [40] Everything is There in Latent Space: Attribute Editing and Attribute Style Manipulation by StyleGAN Latent Space Exploration
    Parihar, Rishubh
    Dhiman, Ankit
    Karmali, Tejan
    Babu, R. Venkatesh
    PROCEEDINGS OF THE 30TH ACM INTERNATIONAL CONFERENCE ON MULTIMEDIA, MM 2022, 2022, : 1828 - 1836