Kernel K-Means Sampling for Nystrom Approximation

Cited by: 47
Authors:
He, Li [1]
Zhang, Hong [2]
Affiliations:
[1] Guangdong Univ Technol, Sch Electromech Engn, Guangzhou 510006, Guangdong, Peoples R China
[2] Univ Alberta, Dept Comp Sci, Edmonton, AB T6G 2E8, Canada
Funding:
National Natural Science Foundation of China;
Keywords:
Kernel matrix approximation; Nystrom approximation; kernel k-means; image segmentation; MATRIX;
DOI:
10.1109/TIP.2018.2796860
CLC Classification:
TP18 [Artificial Intelligence Theory];
Discipline Codes:
081104; 0812; 0835; 1405;
Abstract
A fundamental problem in Nystrom-based kernel matrix approximation is the sampling method by which the training set is built. In this paper, we propose kernel k-means sampling, which we show minimizes the upper bound of the matrix approximation error. We first propose a unified kernel matrix approximation framework that describes most existing Nystrom approximations under many popular kernels, including the Gaussian and polynomial kernels. We then show that the matrix approximation error upper bound, in terms of the Frobenius norm, equals the k-means error of the data points in kernel space plus a constant. Thus, the k-means centers of the data in kernel space, i.e., the kernel k-means centers, are the optimal representative points with respect to the Frobenius-norm error upper bound. Experimental results, with both Gaussian and polynomial kernels, on real-world data sets and image segmentation tasks show the superiority of the proposed method over state-of-the-art methods.
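The idea in the abstract can be sketched numerically: build a Nystrom approximation K ≈ C W⁺ Cᵀ from a set of landmark points, and compare landmarks chosen by k-means against uniformly sampled ones under the Frobenius norm. This is a minimal illustration, not the paper's implementation: for the Gaussian kernel, k-means centers computed in input space are used here as a practical stand-in for kernel k-means centers, and all function names are assumptions of this sketch.

```python
import numpy as np

def gaussian_kernel(X, Y, gamma=1.0):
    # Pairwise Gaussian (RBF) kernel matrix between rows of X and rows of Y.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kmeans_centers(X, k, iters=50, seed=0):
    # Plain Lloyd's k-means; the centers serve as Nystrom landmarks.
    rng = np.random.default_rng(seed)
    C = X[rng.choice(len(X), k, replace=False)]  # fancy indexing -> copy
    for _ in range(iters):
        labels = ((X[:, None, :] - C[None, :, :]) ** 2).sum(-1).argmin(1)
        for j in range(k):
            pts = X[labels == j]
            if len(pts):
                C[j] = pts.mean(0)
    return C

def nystrom(X, landmarks, gamma=1.0):
    # Nystrom approximation: K ~= C W^+ C^T, where C = K(X, Z), W = K(Z, Z).
    C = gaussian_kernel(X, landmarks, gamma)
    W = gaussian_kernel(landmarks, landmarks, gamma)
    return C @ np.linalg.pinv(W) @ C.T

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 5))
K = gaussian_kernel(X, X)                      # exact kernel matrix

Z = kmeans_centers(X, 20)                      # k-means landmarks
K_km = nystrom(X, Z)
K_un = nystrom(X, X[rng.choice(len(X), 20, replace=False)])  # uniform landmarks

err_km = np.linalg.norm(K - K_km, "fro")
err_un = np.linalg.norm(K - K_un, "fro")
print(f"Frobenius error, k-means landmarks: {err_km:.4f}")
print(f"Frobenius error, uniform landmarks: {err_un:.4f}")
```

On most runs the k-means landmarks give a noticeably smaller Frobenius error, consistent with the paper's bound relating approximation error to the k-means objective, though the gap depends on the data and on gamma.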
Pages: 2108-2120
Page count: 13
Related Papers (showing 10 of 50)
  • [1] The Nystrom Kernel Conjugate Gradient Algorithm Based on k-Means Sampling
    He, Fuliang
    Xiong, Kui
    Wang, Shiyuan
    [J]. IEEE ACCESS, 2020, 8 : 18716 - 18726
  • [2] Beyond the Nystrom Approximation: Speeding up Spectral Clustering using Uniform Sampling and Weighted Kernel k-means
    Mohan, Mahesh
    Monteleoni, Claire
    [J]. PROCEEDINGS OF THE TWENTY-SIXTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2017, : 2494 - 2500
  • [3] Scalable Kernel K-Means Clustering with Nystrom Approximation: Relative-Error Bounds
    Wang, Shusen
    Gittens, Alex
    Mahoney, Michael W.
    [J]. JOURNAL OF MACHINE LEARNING RESEARCH, 2019, 20
  • [4] Kernel Recursive Least Squares Algorithm Based on the Nystrom Method With k-Means Sampling
    Zhang, Tao
    Wang, Shiyuan
    Huang, Xuewei
    Jia, Lei
    [J]. IEEE SIGNAL PROCESSING LETTERS, 2020, 27 : 361 - 365
  • [6] The Nystrom minimum kernel risk-sensitive loss algorithm with k-means sampling
    Zhang, Tao
    Wang, Shiyuan
    Huang, Xuewei
    Wang, Lin
    [J]. JOURNAL OF THE FRANKLIN INSTITUTE-ENGINEERING AND APPLIED MATHEMATICS, 2020, 357 (14) : 10082 - 10099
  • [7] Approximation of Kernel k-Means for Streaming Data
    Havens, Timothy C.
    [J]. 2012 21ST INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR 2012), 2012, : 509 - 512
  • [8] Nystrom Method with Kernel K-means++ Samples as Landmarks
    Oglic, Dino
    Gartner, Thomas
    [J]. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 70, 2017, 70
  • [9] A SAMPLING APPROXIMATION FOR LARGE-SCALE K-MEANS
    Phoungphol, Piyaphol
    [J]. ICAART: PROCEEDINGS OF THE 4TH INTERNATIONAL CONFERENCE ON AGENTS AND ARTIFICIAL INTELLIGENCE, VOL 1, 2012, : 324 - 327
  • [10] Kernel Penalized K-means: A feature selection method based on Kernel K-means
    Maldonado, Sebastian
    Carrizosa, Emilio
    Weber, Richard
    [J]. INFORMATION SCIENCES, 2015, 322 : 150 - 160