Learning Stationary Time Series using Gaussian Processes with Nonparametric Kernels

Cited by: 0
Authors
Tobar, Felipe [1]
Bui, Thang D. [2]
Turner, Richard E. [2]
Affiliations
[1] Univ Chile, Ctr Math Modeling, Santiago, Chile
[2] Univ Cambridge, Dept Engn, Cambridge, England
Funding
UK Engineering and Physical Sciences Research Council (EPSRC)
DOI
Not available
CLC Classification
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
We introduce the Gaussian Process Convolution Model (GPCM), a two-stage nonparametric generative procedure that models stationary signals as the convolution between a continuous-time white-noise process and a continuous-time linear filter drawn from a Gaussian process. The GPCM is a continuous-time, nonparametric-window moving average process and, conditionally, is itself a Gaussian process with a nonparametric kernel defined in a probabilistic fashion. The generative model can be considered equivalently in the frequency domain, where the power spectral density of the signal is specified using a Gaussian process. One of the main contributions of the paper is a novel variational free-energy approach based on inter-domain inducing variables that efficiently learns the continuous-time linear filter and infers the driving white-noise process. In turn, this scheme provides closed-form probabilistic estimates of the covariance kernel and of the noise-free signal in both denoising and prediction scenarios. Additionally, the variational inference procedure yields closed-form expressions for the approximate posterior of the spectral density given the observed data, leading to new Bayesian nonparametric approaches to spectrum estimation. The proposed GPCM is validated on synthetic and real-world signals.
Pages: 9