How Deep Are Deep Gaussian Processes?

Cited: 0
Authors
Dunlop, Matthew M. [1]
Girolami, Mark A. [2,3]
Stuart, Andrew M. [1]
Teckentrup, Aretha L. [3,4]
Affiliations
[1] CALTECH, Comp & Math Sci, Pasadena, CA 91125 USA
[2] Imperial Coll London, Dept Math, London SW7 2AZ, England
[3] Alan Turing Inst, 96 Euston Rd, London NW1 2DB, England
[4] Univ Edinburgh, Sch Math, Edinburgh EH9 3FD, Midlothian, Scotland
Funding
UK Engineering and Physical Sciences Research Council (EPSRC); US National Science Foundation (NSF);
Keywords
deep learning; deep Gaussian processes; deep kernels; CALIBRATION; INFERENCE; FIELDS;
DOI
not available
CLC Classification Number
TP [Automation & Computer Technology];
Discipline Classification Code
0812;
Abstract
Recent research has shown the potential utility of deep Gaussian processes. These deep structures are probability distributions, designed through hierarchical construction, which are conditionally Gaussian. In this paper, the current published body of work is placed in a common framework and, through recursion, several classes of deep Gaussian processes are defined. The resulting samples generated from a deep Gaussian process have a Markovian structure with respect to the depth parameter, and the effective depth of the resulting process is interpreted in terms of the ergodicity, or non-ergodicity, of the resulting Markov chain. For the classes of deep Gaussian processes introduced, we provide results concerning their ergodicity and hence their effective depth. We also demonstrate how these processes may be used for inference; in particular we show how a Metropolis-within-Gibbs construction across the levels of the hierarchy can be used to derive sampling tools which are robust to the level of resolution used to represent the functions on a computer. For illustration, we consider the effect of ergodicity in some simple numerical examples.
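The hierarchical, Markov-in-depth construction described in the abstract — each layer conditionally Gaussian given the previous one — can be illustrated with a minimal sketch. This is not the paper's specific construction: the Gibbs-style non-stationary kernel and the particular map from a layer to the next layer's length-scales are illustrative assumptions chosen only to show the recursion u_{n+1} | u_n ~ N(0, C(u_n)).

```python
import numpy as np

def layer_covariance(x, lengthscales, sigma=1.0, jitter=1e-8):
    """Non-stationary squared-exponential (Gibbs-style) covariance on a
    1-D grid: the local length-scale at each point is supplied by the
    previous layer of the hierarchy."""
    l, lT = lengthscales[:, None], lengthscales[None, :]
    d2 = (x[:, None] - x[None, :]) ** 2
    prefactor = np.sqrt(2.0 * l * lT / (l**2 + lT**2))
    K = sigma**2 * prefactor * np.exp(-d2 / (l**2 + lT**2))
    return K + jitter * np.eye(len(x))  # jitter for numerical stability

def sample_deep_gp(x, depth, rng, base_lengthscale=0.2):
    """Draw one sample from a depth-`depth` deep GP on the grid `x`.
    Each layer is Gaussian conditional on the previous layer, which sets
    its local length-scales -- the Markov structure with respect to the
    depth parameter described in the abstract."""
    K0 = layer_covariance(x, np.full(len(x), base_lengthscale))
    u = rng.multivariate_normal(np.zeros(len(x)), K0)
    for _ in range(depth - 1):
        # Illustrative (assumed) map from the previous layer to strictly
        # positive length-scales for the next layer.
        ell = 0.05 + 0.3 * np.exp(-np.abs(u))
        u = rng.multivariate_normal(np.zeros(len(x)),
                                    layer_covariance(x, ell))
    return u

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 100)
sample = sample_deep_gp(x, depth=3, rng=rng)
```

Increasing `depth` iterates the conditional-sampling step; whether the resulting Markov chain over layers is ergodic (and hence how much extra depth actually changes the distribution) is precisely the question the paper analyses.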
Pages: 46
Related Papers (50 total)
  • [1] How deep are deep Gaussian processes?
    Dunlop, Matthew M.
    Girolami, Mark A.
    Stuart, Andrew M.
    Teckentrup, Aretha L.
    Journal of Machine Learning Research, 2018, 19: 1-46
  • [2] Thin and Deep Gaussian Processes
    de Souza, Daniel Augusto
    Nikitin, Alexander
    John, S. T.
    Ross, Magnus
    Alvarez, Mauricio A.
    Deisenroth, Marc Peter
    Gomes, Joao P. P.
    Mesquita, Diego
    Mattos, Cesar Lincoln C.
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023,
  • [3] Deep Convolutional Gaussian Processes
    Blomqvist, Kenneth
    Kaski, Samuel
    Heinonen, Markus
    MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES, ECML PKDD 2019, PT II, 2020, 11907: 582-597
  • [4] Deep Neural Networks as Point Estimates for Deep Gaussian Processes
    Dutordoir, Vincent
    Hensman, James
    van der Wilk, Mark
    Ek, Carl Henrik
    Ghahramani, Zoubin
    Durrande, Nicolas
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [5] Compositional uncertainty in deep Gaussian processes
    Ustyuzhaninov, Ivan
    Kazlauskaite, Ieva
    Kaiser, Markus
    Bodin, Erik
    Campbell, Neill D. F.
    Ek, Carl Henrik
    CONFERENCE ON UNCERTAINTY IN ARTIFICIAL INTELLIGENCE (UAI 2020), 2020, 124: 480-489
  • [6] Deep Gaussian processes: Theory and applications
    Djuric, Petar M.
    2018 14TH SYMPOSIUM ON NEURAL NETWORKS AND APPLICATIONS (NEUREL), 2018,
  • [7] Interpretable Deep Gaussian Processes with Moments
    Lu, Chi-Ken
    Yang, Scott Cheng-Hsin
    Hao, Xiaoran
    Shafto, Patrick
    INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 108, 2020, 108: 613-622
  • [8] Deep Structured Mixtures of Gaussian Processes
    Trapp, Martin
    Peharz, Robert
    Pernkopf, Franz
    Rasmussen, Carl Edward
    INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 108, 2020, 108
  • [9] A sparse expansion for deep Gaussian processes
    Ding, Liang
    Tuo, Rui
    Shahrampour, Shahin
    IISE TRANSACTIONS, 2024, 56 (05): 559-572
  • [10] Calibrating Deep Convolutional Gaussian Processes
    Tran, G.-L.
    Bonilla, E. V.
    Cunningham, J. P.
    Michiardi, P.
    Filippone, M.
    22ND INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 89, 2019, 89