Learning low-rank Mercer kernels with fast-decaying spectrum

Cited by: 4
Authors
Pan, Binbin [2 ]
Lai, Jianhuang [1 ]
Yuen, Pong C. [3 ]
Affiliations
[1] Sun Yat Sen Univ, Sch Informat Sci & Technol, Guangzhou 510275, Guangdong, Peoples R China
[2] Sun Yat Sen Univ, Sch Math & Computat Sci, Guangzhou 510275, Guangdong, Peoples R China
[3] Hong Kong Baptist Univ, Dept Comp Sci, Kowloon, Hong Kong, Peoples R China
Keywords
Low-rank kernel; Fast-decaying spectrum; Spectrum of Gram matrices; COMPONENT ANALYSIS;
DOI
10.1016/j.neucom.2011.04.021
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Low-rank representations have received considerable interest in kernel-based methods. However, these methods assume that the spectrum of the Gaussian or polynomial kernel decays rapidly. This is not always true, and violating the assumption may degrade performance. In this paper, we propose an effective technique for learning low-rank Mercer kernels (LMK) with fast-decaying spectrum. What distinguishes our kernels from classical kernels (Gaussian and polynomial) is that the proposed kernels always yield low-rank Gram matrices whose spectrum decays rapidly, regardless of the data distribution. Furthermore, the LMK can control the decay rate. Thus, our kernels prevent performance degradation when low-rank approximations are used. Our algorithm scales favorably: it is linear in the number of data points and quadratic in the rank of the Gram matrix. Empirical results demonstrate that the proposed method learns a fast-decaying spectrum and significantly improves performance. (C) 2011 Elsevier B.V. All rights reserved.
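To make the abstract's premise concrete, the following is a minimal NumPy sketch, not the authors' LMK construction: it builds a Gaussian (RBF) kernel Gram matrix, the classical kernel the paper contrasts with, inspects its eigenvalue spectrum, and forms a rank-r approximation by truncated eigendecomposition. The bandwidth, data, and rank here are arbitrary assumptions for illustration; the quality of the rank-r approximation depends entirely on how fast the trailing eigenvalues decay, which is the property the proposed kernels guarantee and classical kernels do not.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))  # 200 sample points in R^5 (illustrative data)

# Gaussian (RBF) kernel Gram matrix with an arbitrary bandwidth sigma = 1.
sq = np.sum(X**2, axis=1)
D2 = sq[:, None] + sq[None, :] - 2 * X @ X.T   # pairwise squared distances
K = np.exp(-D2 / 2.0)

# Eigenvalue spectrum of the Gram matrix, in descending order.
eigvals = np.linalg.eigvalsh(K)[::-1]

# Rank-r approximation via truncated eigendecomposition: keep the r
# largest eigenpairs. The Frobenius error equals the l2 norm of the
# discarded trailing eigenvalues, so fast decay means small error.
r = 20
w, V = np.linalg.eigh(K)
idx = np.argsort(w)[::-1][:r]
K_r = (V[:, idx] * w[idx]) @ V[:, idx].T

rel_err = np.linalg.norm(K - K_r, "fro") / np.linalg.norm(K, "fro")
print(f"top eigenvalue: {eigvals[0]:.3f}, rank-{r} relative error: {rel_err:.4f}")
```

For data distributions where the RBF spectrum decays slowly, `rel_err` stays large even for moderate `r`, which is the performance-degradation scenario the abstract describes.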
Pages: 3028 - 3035
Page count: 8
Related Papers
50 records in total
  • [21] Fast optimization algorithm on Riemannian manifolds and its application in low-rank learning
    Chen, Haoran
    Sun, Yanfeng
    Gao, Junbin
    Hu, Yongli
    Yin, Baocai
    NEUROCOMPUTING, 2018, 291 : 59 - 70
  • [22] Fast randomized numerical rank estimation for numerically low-rank matrices
    Meier, Maike
    Nakatsukasa, Yuji
    LINEAR ALGEBRA AND ITS APPLICATIONS, 2024, 686 : 1 - 32
  • [24] Reconstruction of low-rank aggregation kernels in univariate population balance equations
    Ahrens, Robin
    Le Borne, Sabine
    ADVANCES IN COMPUTATIONAL MATHEMATICS, 2021, 47 (03)
  • [26] Online Fast Adaptive Low-Rank Similarity Learning for Cross-Modal Retrieval
    Wu, Yiling
    Wang, Shuhui
    Huang, Qingming
    IEEE TRANSACTIONS ON MULTIMEDIA, 2020, 22 (05) : 1310 - 1322
  • [27] Fast and Memory Optimal Low-Rank Matrix Approximation
    Yun, Se-Young
    Lelarge, Marc
    Proutiere, Alexandre
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 28 (NIPS 2015), 2015, 28
  • [28] Low-Rank Constraints for Fast Inference in Structured Models
    Chiu, Justin T.
    Deng, Yuntian
    Rush, Alexander M.
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [29] Fast Gradient Method for Low-Rank Matrix Estimation
    Li, Hongyi
    Peng, Zhen
    Pan, Chengwei
    Zhao, Di
    JOURNAL OF SCIENTIFIC COMPUTING, 2023, 96
  • [30] Adaptive sampling and fast low-rank matrix approximation
    Deshpande, Amit
    Vempala, Santosh
    APPROXIMATION, RANDOMIZATION AND COMBINATORIAL OPTIMIZATION: ALGORITHMS AND TECHNIQUES, 2006, 4110 : 292 - 303