Gaussian Process Neural Additive Models

Cited by: 0
Authors
Zhang, Wei [1 ]
Barr, Brian [2 ]
Paisley, John [1 ]
Affiliations
[1] Columbia Univ, New York, NY 10025 USA
[2] Capital One, New York, NY USA
Keywords: none listed
DOI: not available
CLC number: TP18 (Artificial Intelligence Theory)
Discipline codes: 081104; 0812; 0835; 1405
Abstract
Deep neural networks have revolutionized many fields, but their black-box nature also occasionally prevents their wider adoption in fields such as healthcare and finance, where interpretable and explainable models are required. The recent development of Neural Additive Models (NAMs) is a significant step in the direction of interpretable deep learning for tabular datasets. In this paper, we propose a new subclass of NAMs that use a single-layer neural network construction of the Gaussian process via random Fourier features, which we call Gaussian Process Neural Additive Models (GP-NAM). GP-NAMs have the advantage of a convex objective function and a number of trainable parameters that grows linearly with feature dimensionality. They suffer no loss in performance compared to deeper NAM approaches because GPs are well-suited for learning complex non-parametric univariate functions. We demonstrate the performance of GP-NAM on several tabular datasets, showing that it achieves comparable or better performance in both classification and regression tasks with a large reduction in the number of parameters.
Pages: 16865-16872 (8 pages)
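
To make the construction described in the abstract concrete, below is a minimal sketch, assuming a squared-error regression objective: each input feature gets its own univariate shape function built from fixed random Fourier features (an approximation of an RBF-kernel Gaussian process), and only the linear output weights are trained. All names and settings (random_fourier_features, fit_gp_nam, n_rff, lengthscale, ridge) are illustrative assumptions, not the authors' released implementation.

```python
# Hypothetical sketch of the GP-NAM idea: per-feature shape functions from
# *frozen* random Fourier features, with only the linear output weights fit.
import numpy as np

def random_fourier_features(x, omega, b):
    """Map a single feature column x of shape (n,) to cos features of shape (n, D)."""
    return np.sqrt(2.0 / omega.shape[0]) * np.cos(np.outer(x, omega) + b)

def fit_gp_nam(X, y, n_rff=64, lengthscale=1.0, ridge=1e-3, seed=0):
    """Fit additive shape functions by ridge regression on fixed random features."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    # Frozen random frequencies and phases per feature (RBF-kernel approximation).
    omegas = rng.normal(0.0, 1.0 / lengthscale, size=(d, n_rff))
    phases = rng.uniform(0.0, 2.0 * np.pi, size=(d, n_rff))
    # Stack per-feature random features into one design matrix of shape (n, d * n_rff).
    Phi = np.hstack([random_fourier_features(X[:, j], omegas[j], phases[j])
                     for j in range(d)])
    # Convex objective: ridge-regularized least squares, solved in closed form.
    A = Phi.T @ Phi + ridge * np.eye(Phi.shape[1])
    w = np.linalg.solve(A, Phi.T @ y)
    return omegas, phases, w

def predict_gp_nam(X, omegas, phases, w):
    """Sum the learned univariate shape functions across features."""
    d, n_rff = omegas.shape
    Phi = np.hstack([random_fourier_features(X[:, j], omegas[j], phases[j])
                     for j in range(d)])
    return Phi @ w
```

Because the random frequencies and phases are frozen, the objective is convex in the weights w, and the trainable parameter count is d * n_rff, i.e. linear in the feature dimensionality, which is the property highlighted in the abstract.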