Diversity Sampling is an Implicit Regularization for Kernel Methods

Cited by: 8
Authors
Fanuel, Michael [1 ]
Schreurs, Joachim [1 ]
Suykens, Johan A. K. [1 ]
Affiliations
[1] Katholieke Univ Leuven, Dept Elect Engn ESAT, STADIUS Ctr Dynam Syst, Signal Proc & Data Analyt, B-3001 Leuven, Belgium
Funding
European Research Council
Keywords
kernel; Nystrom; determinantal point process; regularization
DOI
10.1137/20M1320031
Chinese Library Classification
O29 [Applied Mathematics]
Subject Classification Code
070104
Abstract
Kernel methods have achieved very good performance on large-scale regression and classification problems by using the Nystrom method and preconditioning techniques. The Nystrom approximation, based on a subset of landmarks, gives a low-rank approximation of the kernel matrix and is known to provide a form of implicit regularization. We further elaborate on the impact of sampling diverse landmarks for constructing the Nystrom approximation in supervised as well as unsupervised kernel methods. By using Determinantal Point Processes (DPPs) for sampling, we obtain additional theoretical results concerning the interplay between diversity and regularization. Empirically, we demonstrate the advantages of training kernel methods on subsets made of diverse points. In particular, if the dataset has a dense bulk and a sparser tail, we show that Nystrom kernel regression with diverse landmarks increases the accuracy of the regression in sparser regions of the dataset compared to uniform landmark sampling. A greedy heuristic is also proposed to select diverse samples of significant size within large datasets when exact DPP sampling is not practically feasible.
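To make the setting of the abstract concrete, the following minimal sketch (not the authors' implementation) fits a Nystrom-approximated kernel ridge regression on toy data with a dense bulk and a sparser tail, and selects landmarks with a greedy pivoted-Cholesky heuristic used here as a simple stand-in for exact DPP sampling. All function names (rbf_kernel, greedy_diverse_landmarks, nystrom_krr), the regularized normal equations, and the parameter values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Gaussian (RBF) kernel matrix between the rows of X and Y."""
    sq = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-gamma * np.maximum(sq, 0.0))

def greedy_diverse_landmarks(X, m, gamma=1.0):
    """Greedy diversity heuristic (pivoted-Cholesky style): repeatedly pick the
    point with the largest residual kernel variance given the points already
    chosen. A simple stand-in for exact DPP sampling, not the paper's algorithm."""
    n = X.shape[0]
    diag = np.ones(n)                 # k(x, x) = 1 for the RBF kernel
    F = np.zeros((n, m))              # partial Cholesky factors of selected columns
    selected = []
    for j in range(m):
        i = int(np.argmax(diag))      # most "novel" point given current selection
        selected.append(i)
        col = rbf_kernel(X, X[i:i + 1], gamma)[:, 0] - F[:, :j] @ F[i, :j]
        F[:, j] = col / np.sqrt(diag[i])
        diag = np.maximum(diag - F[:, j] ** 2, 0.0)
        diag[i] = 0.0                 # never reselect the same point
    return np.array(selected)

def nystrom_krr(X, y, landmarks, gamma=1.0, reg=1e-2):
    """Kernel ridge regression with the Nystrom (subset-of-landmarks) approximation."""
    C = rbf_kernel(X, X[landmarks], gamma)      # n x m cross-kernel block
    W = C[landmarks, :]                         # m x m landmark kernel block
    # Regularized normal equations of the low-rank (Nystrom) regression problem
    a = np.linalg.solve(C.T @ C + reg * W, C.T @ y)
    return lambda X_new: rbf_kernel(X_new, X[landmarks], gamma) @ a

# Toy 1-D data with a dense bulk and a sparser tail
rng = np.random.default_rng(0)
X = np.concatenate([rng.normal(0.0, 1.0, (500, 1)), rng.normal(6.0, 3.0, (30, 1))])
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(X.shape[0])

# Diverse landmarks versus a uniform baseline
diverse_idx = greedy_diverse_landmarks(X, 20, gamma=0.5)
uniform_idx = rng.choice(X.shape[0], size=20, replace=False)
f_diverse = nystrom_krr(X, y, diverse_idx, gamma=0.5)
f_uniform = nystrom_krr(X, y, uniform_idx, gamma=0.5)
print(f_diverse(np.array([[8.0]])), f_uniform(np.array([[8.0]])))
```

Swapping greedy_diverse_landmarks for the uniform rng.choice baseline reproduces the comparison the abstract describes: the paper's claim is that diverse landmarks improve accuracy in the sparse tail of the data relative to uniform sampling.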
Pages: 280-297
Number of pages: 18