Learning latent representations in high-dimensional state spaces using polynomial manifold constructions

Cited by: 0
Authors
Geelen, Rudy [1 ]
Balzano, Laura [2 ]
Willcox, Karen [1 ]
Affiliations
[1] Univ Texas Austin, Oden Inst Computat Engn & Sci, Austin, TX 78712 USA
[2] Univ Michigan Ann Arbor, Dept Elect Engn & Comp Sci, Ann Arbor, MI 48109 USA
Keywords
STATISTICAL VARIABLES; COMPLEX
DOI
10.1109/CDC49753.2023.10384209
Chinese Library Classification (CLC): TP [Automation Technology, Computer Technology]
Discipline code: 0812
Abstract
We present a novel framework for learning cost-efficient latent representations in problems with high-dimensional state spaces through nonlinear dimension reduction. By enriching linear state approximations with low-order polynomial terms, we account for key nonlinear interactions in the data, thereby reducing the problem's intrinsic dimensionality. Two methods are introduced for learning the representation of such low-dimensional polynomial manifolds for embedding the data. The manifold parametrization coefficients can be obtained by regression via either a proper orthogonal decomposition or an alternating-minimization-based approach. Our numerical results focus on the one-dimensional Korteweg-de Vries equation, where accounting for nonlinear correlations in the data was found to lower the representation error by up to two orders of magnitude compared to linear dimension reduction techniques.
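The two-step construction the abstract describes (a linear basis from proper orthogonal decomposition, enriched with polynomial terms whose coefficients are fit by regression) can be sketched roughly as follows. This is a minimal illustration on synthetic data, not the authors' implementation: the variable names, dimensions, and the choice of a quadratic enrichment fit by ordinary least squares are all assumptions.

```python
import numpy as np

# Illustrative sketch of a polynomial-manifold representation:
# approximate states x as x ≈ V q + W (q ⊗ q), where V is a linear
# POD basis and W is a quadratic correction fit by regression.
rng = np.random.default_rng(0)
n, m, r = 60, 200, 4                # state dim, number of snapshots, reduced dim

# Synthetic snapshot matrix with a hidden quadratic structure (assumption).
Q_true = rng.standard_normal((r, m))
V_true = np.linalg.qr(rng.standard_normal((n, r)))[0]
W_true = 0.1 * rng.standard_normal((n, r * r))
K_true = np.einsum('im,jm->ijm', Q_true, Q_true).reshape(r * r, m)
X = V_true @ Q_true + W_true @ K_true

# Step 1: linear part via POD (truncated SVD of the snapshot matrix).
U, s, _ = np.linalg.svd(X, full_matrices=False)
V = U[:, :r]                        # POD basis
Q = V.T @ X                         # reduced (latent) coordinates

# Step 2: quadratic enrichment by least-squares regression on the
# residual of the linear approximation.
Q2 = np.einsum('im,jm->ijm', Q, Q).reshape(r * r, m)   # features q ⊗ q
R = X - V @ Q                       # residual of the linear reconstruction
W = R @ np.linalg.pinv(Q2)          # least-squares fit of the quadratic map

err_lin = np.linalg.norm(X - V @ Q) / np.linalg.norm(X)
err_quad = np.linalg.norm(X - (V @ Q + W @ Q2)) / np.linalg.norm(X)
print(err_lin, err_quad)
```

Because W = 0 is a feasible choice in the regression, the enriched representation error can never exceed the purely linear one; on data with genuine low-order nonlinear structure, as in the paper's Korteweg-de Vries example, the reduction can be substantial.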
Pages: 4960-4965
Page count: 6
Related Papers
(50 records in total)
  • [31] A state space compression method based on multivariate analysis for reinforcement learning in high-dimensional continuous state spaces
    Satoh, Hideki
    IEICE TRANSACTIONS ON FUNDAMENTALS OF ELECTRONICS COMMUNICATIONS AND COMPUTER SCIENCES, 2006, E89A (08): : 2181 - 2191
  • [32] Scaled free-energy based reinforcement learning for robust and efficient learning in high-dimensional state spaces
    Elfwing, Stefan
    Uchibe, Eiji
    Doya, Kenji
    FRONTIERS IN NEUROROBOTICS, 2013, 7
  • [33] EM in high-dimensional spaces
    Draper, BA
    Elliott, DL
    Hayes, J
    Baek, K
    IEEE TRANSACTIONS ON SYSTEMS MAN AND CYBERNETICS PART B-CYBERNETICS, 2005, 35 (03): : 571 - 577
  • [34] The mathematics of high-dimensional spaces
    Rogers, D
    ABSTRACTS OF PAPERS OF THE AMERICAN CHEMICAL SOCIETY, 1998, 215 : U524 - U524
  • [36] Polynomial whitening for high-dimensional data
    Gillard, Jonathan
    O'Riordan, Emily
    Zhigljavsky, Anatoly
    COMPUTATIONAL STATISTICS, 2023, 38 (03) : 1427 - 1461
  • [37] Using deep reinforcement learning to reveal how the brain encodes abstract state-space representations in high-dimensional environments
    Cross, Logan
    Cockburn, Jeff
    Yue, Yisong
    O'Doherty, John P.
    NEURON, 2021, 109 (04) : 724 - 738.e7
  • [38] Learning Attribute Patterns in High-Dimensional Structured Latent Attribute Models
    Gu, Yuqi
    Xu, Gongjun
    JOURNAL OF MACHINE LEARNING RESEARCH, 2019, 20
  • [39] A Scalable Approach to Modeling Nonlinear Structure in Hyperspectral Imagery and Other High-Dimensional Data Using Manifold Coordinate Representations
    Bachmann, Charles M.
    Ainsworth, Thomas L.
    Fusina, Robert A.
    ALGORITHMS AND TECHNOLOGIES FOR MULTISPECTRAL, HYPERSPECTRAL, AND ULTRASPECTRAL IMAGERY XVI, 2010, 7695
  • [40] LEARNING HIGH-DIMENSIONAL DIRECTED ACYCLIC GRAPHS WITH LATENT AND SELECTION VARIABLES
    Colombo, Diego
    Maathuis, Marloes H.
    Kalisch, Markus
    Richardson, Thomas S.
    ANNALS OF STATISTICS, 2012, 40 (01): : 294 - 321