Learning latent representations in high-dimensional state spaces using polynomial manifold constructions

Cited by: 0
Authors
Geelen, Rudy [1 ]
Balzano, Laura [2 ]
Willcox, Karen [1 ]
Affiliations
[1] University of Texas at Austin, Oden Institute for Computational Engineering and Sciences, Austin, TX 78712, USA
[2] University of Michigan, Department of Electrical Engineering and Computer Science, Ann Arbor, MI 48109, USA
Keywords
STATISTICAL VARIABLES; COMPLEX
DOI
10.1109/CDC49753.2023.10384209
CLC classification
TP [Automation technology; computer technology]
Discipline code
0812
Abstract
We present a novel framework for learning cost-efficient latent representations in problems with high-dimensional state spaces through nonlinear dimension reduction. By enriching linear state approximations with low-order polynomial terms, we account for key nonlinear interactions in the data, thereby reducing the problem's intrinsic dimensionality. Two methods are introduced for learning such low-dimensional polynomial manifolds for embedding the data: the manifold parametrization coefficients can be obtained by regression via either a proper orthogonal decomposition or an alternating-minimization-based approach. Our numerical results focus on the one-dimensional Korteweg-de Vries equation, where accounting for nonlinear correlations in the data lowers the representation error by up to two orders of magnitude compared with linear dimension-reduction techniques.
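As a rough illustration of the construction described in the abstract, the sketch below fits the quadratic (degree-2) special case of such a polynomial manifold using the POD-based regression variant: a linear POD basis V is computed from the snapshot data, and a coefficient matrix Vbar for the quadratic (Kronecker-product) terms is obtained by least-squares regression on the residual of the linear approximation. The function names, the ridge regularization parameter, and the NumPy formulation are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def fit_quadratic_manifold(S, r, reg=1e-6):
    """Illustrative sketch (not the authors' code): fit a quadratic manifold
    s ~= V q + Vbar kron(q, q) to a snapshot matrix S (n x k) by computing a
    POD basis V and regressing the linear-approximation residual onto
    quadratic features of the latent coordinates."""
    # Linear part: leading r left singular vectors of the snapshot matrix (POD basis)
    U, _, _ = np.linalg.svd(S, full_matrices=False)
    V = U[:, :r]              # n x r POD basis
    Q = V.T @ S               # r x k latent coordinates
    R = S - V @ Q             # n x k residual missed by the linear approximation

    # Quadratic (Kronecker-product) features of the latent coordinates, r^2 x k
    W = np.einsum("ik,jk->ijk", Q, Q).reshape(r * r, -1)

    # Ridge-regularized least squares for the polynomial coefficient matrix Vbar (n x r^2);
    # the regularization weight `reg` is an assumed hyperparameter
    A = W @ W.T + reg * np.eye(r * r)
    Vbar = np.linalg.solve(A, W @ R.T).T
    return V, Vbar

def decode(V, Vbar, q):
    """Map latent coordinates q (length r) back to an approximate full state."""
    return V @ q + Vbar @ np.kron(q, q)
```

The alternating-minimization variant mentioned in the abstract would instead iterate between updating the basis and the polynomial coefficients rather than fixing V to the POD basis, and higher-order polynomial manifolds extend the feature map beyond the Kronecker square used in this sketch.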
Pages: 4960-4965
Number of pages: 6