A Recurrent Latent Variable Model for Supervised Modeling of High-Dimensional Sequential Data

Cited: 0
Authors
Christodoulou, Panayiotis [1 ]
Chatzis, Sotirios P. [1 ]
Andreou, Andreas S. [1 ]
Affiliations
[1] Cyprus Univ Technol, Dept EECEI, Limassol, Cyprus
Keywords
Recurrent latent variable; amortized variational inference; high-dimensional sequences; predictive modeling; HIDDEN; WORD
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
In this work, we attempt to ameliorate the impact of data sparsity in the context of supervised modeling applications dealing with high-dimensional sequential data. Specifically, we seek to devise a machine learning mechanism capable of extracting subtle and complex underlying temporal dynamics in the observed sequential data, so as to inform the predictive algorithm. To this end, we improve upon systems that utilize deep learning techniques with recurrently connected units; we do so by adopting concepts from the field of Bayesian statistics, namely variational inference. Our proposed approach consists in treating the network recurrent units as stochastic latent variables with a prior distribution imposed over them. On this basis, we proceed to infer corresponding posteriors; these can be used for prediction generation, in a way that accounts for the uncertainty in the available sparse training data. To allow for our approach to easily scale to large real-world datasets, we perform inference under an approximate amortized variational inference (AVI) setup, whereby the learned posteriors are parameterized via (conventional) neural networks. We perform an extensive experimental evaluation of our approach using challenging benchmark datasets, and illustrate its superiority over existing state-of-the-art techniques.
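To make the high-level description above concrete, the following is a minimal, hypothetical sketch (in PyTorch; not the authors' released code) of a recurrent network whose hidden dynamics are driven by stochastic latent variables and which is trained with amortized variational inference: a prior network parameterizes p(z_t | h_{t-1}), an encoder network parameterizes the approximate posterior q(z_t | x_t, h_{t-1}), and the training objective is a negative ELBO combining a supervised prediction loss with a KL term. The layer sizes, the Gaussian likelihood (MSE), and the class name RecurrentLatentModel are illustrative assumptions.

# Minimal sketch, assuming a PyTorch setup; not the authors' implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class RecurrentLatentModel(nn.Module):
    """GRU whose hidden dynamics are driven by stochastic latent variables z_t.

    At each step, a prior p(z_t | h_{t-1}) and an amortized posterior
    q(z_t | x_t, h_{t-1}) are parameterized by small networks; z_t is sampled
    with the reparameterization trick and used both to update the recurrence
    and to generate the supervised prediction y_t.
    """

    def __init__(self, x_dim, y_dim, h_dim=64, z_dim=16):
        super().__init__()
        self.rnn = nn.GRUCell(x_dim + z_dim, h_dim)
        # Prior network: p(z_t | h_{t-1}), outputs (mean, log-variance).
        self.prior = nn.Linear(h_dim, 2 * z_dim)
        # Amortized inference network: q(z_t | x_t, h_{t-1}).
        self.encoder = nn.Linear(x_dim + h_dim, 2 * z_dim)
        # Prediction head: p(y_t | z_t, h_{t-1}).
        self.decoder = nn.Sequential(
            nn.Linear(z_dim + h_dim, h_dim), nn.ReLU(), nn.Linear(h_dim, y_dim)
        )
        self.h_dim, self.z_dim = h_dim, z_dim

    def forward(self, x, y):
        """x: (T, B, x_dim), y: (T, B, y_dim). Returns the negative ELBO."""
        T, B, _ = x.shape
        h = x.new_zeros(B, self.h_dim)
        nll, kl = 0.0, 0.0
        for t in range(T):
            # Prior and approximate posterior parameters (mean, log-variance).
            pm, plv = self.prior(h).chunk(2, dim=-1)
            qm, qlv = self.encoder(torch.cat([x[t], h], dim=-1)).chunk(2, dim=-1)
            # Reparameterized sample z_t ~ q(z_t | x_t, h_{t-1}).
            z = qm + torch.randn_like(qm) * torch.exp(0.5 * qlv)
            # Supervised prediction term (Gaussian likelihood, i.e. MSE here).
            y_hat = self.decoder(torch.cat([z, h], dim=-1))
            nll = nll + F.mse_loss(y_hat, y[t], reduction="mean")
            # Analytic KL(q || p) between the two diagonal Gaussians.
            kl = kl + 0.5 * (
                plv - qlv + (qlv.exp() + (qm - pm) ** 2) / plv.exp() - 1.0
            ).sum(-1).mean()
            # Recurrence: h_t = GRU([x_t, z_t], h_{t-1}).
            h = self.rnn(torch.cat([x[t], z], dim=-1), h)
        return nll + kl  # negative ELBO, averaged over the batch

# Usage on random data, purely for illustration.
model = RecurrentLatentModel(x_dim=10, y_dim=3)
x = torch.randn(20, 8, 10)   # (time steps, batch, features)
y = torch.randn(20, 8, 3)
loss = model(x, y)
loss.backward()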
Pages: 9
Related Papers (50 in total)
  • [1] Multi-kernel Gaussian process latent variable regression model for high-dimensional sequential data modeling
    Zhu, Ziqi
    Zhang, Jiayuan
    Zou, Jixin
    Deng, Chunhua
    NEUROCOMPUTING, 2019, 348: 3-15
  • [2] A Recurrent Latent Variable Model for Sequential Data
    Chung, Junyoung
    Kastner, Kyle
    Dinh, Laurent
    Goel, Kratarth
    Courville, Aaron
    Bengio, Yoshua
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 28 (NIPS 2015), 2015, 28
  • [3] Supervised Bayesian latent class models for high-dimensional data
    Desantis, Stacia M.
    Houseman, E. Andres
    Coull, Brent A.
    Nutt, Catherine L.
    Betensky, Rebecca A.
    STATISTICS IN MEDICINE, 2012, 31(13): 1342-1360
  • [4] A Robust Supervised Variable Selection for Noisy High-Dimensional Data
    Kalina, Jan
    Schlenker, Anna
    BIOMED RESEARCH INTERNATIONAL, 2015, 2015
  • [5] Supervised model-based visualization of high-dimensional data
    Kontkanen, Petri
    Lahtinen, Jussi
    Myllymäki, Petri
    Silander, Tomi
    Tirri, Henry
    Intelligent Data Analysis, 2000, 4(3-4): 213-227
  • [6] Supervised clustering of high-dimensional data using regularized mixture modeling
    Chang, Wennan
    Wan, Changlin
    Zang, Yong
    Zhang, Chi
    Cao, Sha
    BRIEFINGS IN BIOINFORMATICS, 2021, 22(04)
  • [7] On generalized latent factor modeling and inference for high-dimensional binomial data
    Ma, Ting Fung
    Wang, Fangfang
    Zhu, Jun
    BIOMETRICS, 2023, 79(03): 2311-2320
  • [8] Efficient Dynamic Latent Variable Analysis for High-Dimensional Time Series Data
    Dong, Yining
    Liu, Yingxiang
    Qin, S.
    IEEE TRANSACTIONS ON INDUSTRIAL INFORMATICS, 2020, 16(06): 4068-4076
  • [9] LATENT VARIABLE SYMBOLIC REGRESSION FOR HIGH-DIMENSIONAL INPUTS
    McConaghy, Trent
    GENETIC PROGRAMMING THEORY AND PRACTICE VII, 2010: 103-118
  • [10] Modeling High-Dimensional Data
    Vempala, Santosh S.
    COMMUNICATIONS OF THE ACM, 2012, 55(02): 112-112