Learning low-rank Mercer kernels with fast-decaying spectrum

Cited by: 4
Authors
Pan, Binbin [2 ]
Lai, Jianhuang [1 ]
Yuen, Pong C. [3 ]
Affiliations
[1] Sun Yat Sen Univ, Sch Informat Sci & Technol, Guangzhou 510275, Guangdong, Peoples R China
[2] Sun Yat Sen Univ, Sch Math & Computat Sci, Guangzhou 510275, Guangdong, Peoples R China
[3] Hong Kong Baptist Univ, Dept Comp Sci, Kowloon, Hong Kong, Peoples R China
Keywords
Low-rank kernel; Fast-decaying spectrum; Spectrum of Gram matrices; COMPONENT ANALYSIS;
DOI
10.1016/j.neucom.2011.04.021
CLC classification
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Low-rank representations have received considerable interest in the application of kernel-based methods. However, these methods assume that the spectrum of the Gaussian or polynomial kernel decays rapidly. This is not always true, and violating this assumption may degrade performance. In this paper, we propose an effective technique for learning low-rank Mercer kernels (LMK) with fast-decaying spectrum. What distinguishes our kernels from classical kernels (Gaussian and polynomial kernels) is that the proposed kernels always yield low-rank Gram matrices whose spectrum decays rapidly, regardless of the data distribution. Furthermore, the LMK can control the decay rate. Thus, our kernels prevent performance degradation when low-rank approximations are used. Our algorithm scales favorably: it is linear in the number of data points and quadratic in the rank of the Gram matrix. Empirical results demonstrate that the proposed method learns kernels with fast-decaying spectra and significantly improves performance. (C) 2011 Elsevier B.V. All rights reserved.
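The sketch below is not the authors' LMK algorithm; it is a minimal NumPy illustration of the spectral-decay issue the abstract refers to: how well a rank-r truncation approximates a Gaussian Gram matrix depends entirely on how fast its eigenvalues decay, which in turn depends on the data distribution. The data sets, bandwidth `sigma`, and rank `r` below are illustrative assumptions.

```python
# Minimal sketch (illustration only, not the paper's LMK method):
# compare the rank-r truncation error of a Gaussian Gram matrix on
# data whose spectrum decays fast vs. data whose spectrum decays slowly.
import numpy as np

def gaussian_gram(X, sigma=1.0):
    """Gram matrix K[i, j] = exp(-||x_i - x_j||^2 / (2 sigma^2))."""
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-np.maximum(d2, 0.0) / (2.0 * sigma**2))

def rank_r_error(K, r):
    """Relative Frobenius error of the best rank-r approximation of K."""
    w = np.sort(np.linalg.eigvalsh(K))[::-1]   # eigenvalues, descending
    tail = np.sum(w[r:]**2)                    # energy lost by truncation
    return np.sqrt(tail / np.sum(w**2)), w

rng = np.random.default_rng(0)
n, r = 500, 20

# Tightly clustered 2-D data: the Gaussian kernel's spectrum decays fast.
X_fast = rng.normal(size=(n, 2)) * 0.1 + rng.integers(0, 3, size=(n, 1))
# Spread-out high-dimensional data: the spectrum decays slowly (K is near-diagonal).
X_slow = rng.normal(size=(n, 50))

for name, X in [("fast-decay case", X_fast), ("slow-decay case", X_slow)]:
    err, w = rank_r_error(gaussian_gram(X, sigma=1.0), r)
    print(f"{name}: rank-{r} relative error = {err:.3f}, "
          f"top-{r} eigenvalue share = {np.sum(w[:r]) / np.sum(w):.3f}")
```

When the spectrum decays slowly (the high-dimensional case above), a fixed-rank truncation discards most of the Gram matrix's energy; this is the degradation that kernels with a guaranteed fast-decaying spectrum, as proposed in the paper, are designed to avoid.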
Pages: 3028 - 3035
Number of pages: 8
Related papers
50 records in total
  • [1] Fast Recursive Low-rank Tensor Learning for Regression
    Hou, Ming
    Chaib-draa, Brahim
    PROCEEDINGS OF THE TWENTY-SIXTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2017, : 1851 - 1857
  • [2] Accurate and fast matrix factorization for low-rank learning
    Godaz, Reza
    Monsefi, Reza
    Toutounian, Faezeh
    Hosseini, Reshad
    JOURNAL OF MATHEMATICAL MODELING, 2022, 10 (02): : 263 - 278
  • [3] Fast Low-Rank Matrix Learning with Nonconvex Regularization
    Yao, Quanming
    Kwok, James T.
    Zhong, Wenliang
    2015 IEEE INTERNATIONAL CONFERENCE ON DATA MINING (ICDM), 2015, : 539 - 548
  • [4] Learning Fast Low-Rank Projection for Image Classification
    Li, Jun
    Kong, Yu
    Zhao, Handong
    Yang, Jian
    Fu, Yun
    IEEE TRANSACTIONS ON IMAGE PROCESSING, 2016, 25 (10) : 4803 - 4814
  • [5] lp-norm multiple kernel learning with low-rank kernels
    Rakotomamonjy, Alain
    Chanda, Sukalpa
    NEUROCOMPUTING, 2014, 143 : 68 - 79
  • [6] Discriminative Orthonormal Dictionary Learning for Fast Low-Rank Representation
    Dong, Zhen
    Pei, Mingtao
    Jia, Yunde
    NEURAL INFORMATION PROCESSING, PT I, 2015, 9489 : 79 - 89
  • [7] Fast Low-Rank Shared Dictionary Learning for Image Classification
    Vu, Tiep Huu
    Monga, Vishal
    IEEE TRANSACTIONS ON IMAGE PROCESSING, 2017, 26 (11) : 5160 - 5175
  • [8] On the Realization of Impulse Invariant Low-Rank Volterra Kernels
    Burt, Phillip M. S.
    de Morais Goulart, Jose Henrique
    IEEE SIGNAL PROCESSING LETTERS, 2022, 29 : 1492 - 1496
  • [9] Low-Rank Spectral Learning
    Kulesza, Alex
    Rao, N. Raj
    Singh, Satinder
    ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 33, 2014, 33 : 522 - 530
  • [10] Fast Low-Rank Subspace Segmentation
    Zhang, Xin
    Sun, Fuchun
    Liu, Guangcan
    Ma, Yi
    IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2014, 26 (05) : 1293 - 1297