Deep Learning with Kernels through RKHM and the Perron-Frobenius Operator

Times cited: 0
Authors
Hashimoto, Yuka [1 ,2 ]
Ikeda, Masahiro [2 ,3 ]
Kadri, Hachem [4 ]
Affiliations
[1] NTT Network Serv Syst Labs, Tokyo, Japan
[2] RIKEN AIP, Tokyo, Japan
[3] Keio Univ, Tokyo, Japan
[4] Aix Marseille Univ, CNRS, LIS, Marseille, France
Source
ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023) | 2023
Keywords
DOI
Not available
Chinese Library Classification (CLC) number
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The reproducing kernel Hilbert C*-module (RKHM) is a generalization of the reproducing kernel Hilbert space (RKHS) by means of C*-algebra, and the Perron-Frobenius operator is a linear operator related to the composition of functions. Combining these two concepts, we present deep RKHM, a deep learning framework for kernel methods. We derive a new Rademacher generalization bound in this setting and provide a theoretical interpretation of benign overfitting by means of Perron-Frobenius operators. By virtue of C*-algebra, the dependency of the bound on the output dimension is milder than that of existing bounds. We show that C*-algebra is a suitable tool for deep learning with kernels, enabling us to take advantage of the product structure of operators and to provide a clear connection with convolutional neural networks. Our theoretical analysis provides a new lens through which one can design and analyze deep kernel methods.
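As a point of reference, one standard way to make the phrase "related to the composition of functions" precise (an illustrative sketch based on the usual definition in the kernel literature, not taken from this record) is the following: given a feature map $\phi$ associated with a kernel and a map $f$ on the input space, the Perron-Frobenius operator $P_f$ is the linear operator determined on feature vectors by composition with $f$,

\[
  P_f\,\phi(x) = \phi(f(x)) \qquad \text{for every input } x,
\]

so that composing functions in the input space corresponds to applying a linear operator in the (here, C*-module-valued) feature space.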
Pages: 20