Learning low-rank Mercer kernels with fast-decaying spectrum

Cited by: 4
Authors
Pan, Binbin [2 ]
Lai, Jianhuang [1 ]
Yuen, Pong C. [3 ]
Affiliations
[1] Sun Yat Sen Univ, Sch Informat Sci & Technol, Guangzhou 510275, Guangdong, Peoples R China
[2] Sun Yat Sen Univ, Sch Math & Computat Sci, Guangzhou 510275, Guangdong, Peoples R China
[3] Hong Kong Baptist Univ, Dept Comp Sci, Kowloon, Hong Kong, Peoples R China
Keywords
Low-rank kernel; Fast-decaying spectrum; Spectrum of Gram matrices; COMPONENT ANALYSIS;
DOI
10.1016/j.neucom.2011.04.021
CLC classification number
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Low-rank representations have received considerable interest in kernel-based methods. However, these methods assume that the spectrum of the Gaussian or polynomial kernel decays rapidly. This is not always true, and its violation may result in performance degradation. In this paper, we propose an effective technique for learning low-rank Mercer kernels (LMK) with fast-decaying spectrum. What distinguishes our kernels from classical kernels (Gaussian and polynomial) is that the proposed kernels always yield low-rank Gram matrices whose spectrum decays rapidly, regardless of the data distribution. Furthermore, the LMK can control the decay rate. Thus, our kernels prevent performance degradation when low-rank approximations are used. Our algorithm scales favorably: it is linear in the number of data points and quadratic in the rank of the Gram matrix. Empirical results demonstrate that the proposed method learns a fast-decaying spectrum and significantly improves performance. (C) 2011 Elsevier B.V. All rights reserved.
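The structural property the abstract relies on can be illustrated with a minimal NumPy sketch: any Mercer kernel induced by an explicit r-dimensional feature map phi yields a Gram matrix of rank at most r, so its spectrum vanishes beyond the r-th eigenvalue no matter how the data are distributed. This is a generic linear-algebra illustration, not the paper's learned kernel; the projection `W` and the tanh nonlinearity here are hypothetical stand-ins for whatever map the LMK method would learn.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, r = 200, 10, 5                 # n points in d dimensions, target rank r

X = rng.standard_normal((n, d))      # data drawn from an arbitrary distribution
W = rng.standard_normal((d, r))      # stand-in for a learned projection (hypothetical)
Phi = np.tanh(X @ W)                 # explicit r-dimensional feature map phi(x)

G = Phi @ Phi.T                      # n x n Gram matrix; rank(G) <= r by construction
eigvals = np.sort(np.linalg.eigvalsh(G))[::-1]   # spectrum, largest first

print(np.linalg.matrix_rank(G))      # at most r
print(eigvals[r:].max())             # eigenvalues beyond the r-th are numerically zero
```

Because the Gram matrix factors as `Phi @ Phi.T`, its leading eigenpairs can be obtained from the r x r matrix `Phi.T @ Phi` in O(n r^2) time, which matches the scaling stated in the abstract (linear in the number of points, quadratic in the rank).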
Pages: 3028-3035
Number of pages: 8
Related papers
50 records in total
  • [41] Deep Low-Rank Coding for Transfer Learning
    Ding, Zhengming
    Shao, Ming
    Fu, Yun
    PROCEEDINGS OF THE TWENTY-FOURTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE (IJCAI), 2015, : 3453 - 3459
  • [42] EFFICIENT LEARNING OF DICTIONARIES WITH LOW-RANK ATOMS
    Ravishankar, Saiprasad
    Moore, Brian E.
    Nadakuditi, Raj Rao
    Fessler, Jeffrey A.
    2016 IEEE GLOBAL CONFERENCE ON SIGNAL AND INFORMATION PROCESSING (GLOBALSIP), 2016, : 222 - 226
  • [44] Learning Low-Rank Graph With Enhanced Supervision
    Liu, Hui
    Jia, Yuheng
    Hou, Junhui
    Zhang, Qingfu
    IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY, 2022, 32 (04) : 2501 - 2506
  • [45] Fast Algorithms for Recovering a Corrupted Low-Rank Matrix
    Ganesh, Arvind
    Lin, Zhouchen
    Wright, John
    Wu, Leqin
    Chen, Minming
    Ma, Yi
    2009 3RD IEEE INTERNATIONAL WORKSHOP ON COMPUTATIONAL ADVANCES IN MULTI-SENSOR ADAPTIVE PROCESSING (CAMSAP 2009), 2009, : 213 - +
  • [46] Learning Low-Rank Representations for Model Compression
    Zhu, Zezhou
    Dong, Yuan
    Zhao, Zhong
    2023 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, IJCNN, 2023,
  • [47] DECENTRALIZED LEARNING IN THE PRESENCE OF LOW-RANK NOISE
    Nassif, Roula
    Bordignon, Virginia
    Vlaski, Stefan
    Sayed, Ali H.
    2022 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2022, : 5667 - 5671
  • [48] Low-Rank Representation of Reinforcement Learning Policies
    Mazoure, Bogdan
    Doan, Thang
    Li, Tianyu
    Makarenkov, Vladimir
    Pineau, Joelle
    Precup, Doina
    Rabuseau, Guillaume
    JOURNAL OF ARTIFICIAL INTELLIGENCE RESEARCH, 2022, 75 : 597 - 636
  • [49] Subspace Learning Based Low-Rank Representation
    Tang, Kewei
    Liu, Xiaodong
    Su, Zhixun
    Jiang, Wei
    Dong, Jiangxin
    COMPUTER VISION - ACCV 2016, PT I, 2017, 10111 : 416 - 431
  • [50] Learning-Based Low-Rank Approximations
    Indyk, Piotr
    Vakilian, Ali
    Yuan, Yang
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 32 (NIPS 2019), 2019, 32