Large-scale Online Kernel Learning with Random Feature Reparameterization

Cited by: 0
Authors
Tu Dinh Nguyen [1 ]
Le, Trung [1 ]
Bui, Hung [2 ]
Phung, Dinh [1 ]
Affiliations
[1] Deakin Univ, Ctr Pattern Recognit & Data Analyt, Geelong, Vic, Australia
[2] Adobe Syst Inc, San Jose, CA 95110 USA
Funding
Australian Research Council;
Keywords
DOI
(not available)
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
A typical online kernel learning method faces two fundamental issues: the complexity of dealing with a huge number of observed data points (a.k.a. the curse of kernelization) and the difficulty of learning kernel parameters, which are often assumed to be fixed. Random Fourier features are a recent and effective approach to the former: they approximate a shift-invariant kernel function via Bochner's theorem, allowing the model to be maintained directly in a random feature space of fixed dimension, so the model size remains constant w.r.t. the data size. In this paper we further introduce the reparameterized random feature (RRF), a random feature framework for large-scale online kernel learning that addresses both of the aforementioned challenges. Our initial intuition comes from the so-called 'reparameterization trick' [Kingma and Welling, 2014]: lifting the source of randomness of the Fourier components to another space that can be sampled independently, so that the stochastic gradient with respect to the kernel parameters can be derived analytically. We develop a well-founded underlying theory for our method, including a general way to reparameterize the kernel and a new, tighter error bound on the approximation quality. This view further inspires a direct application of stochastic gradient descent for updating our model in an online learning setting. We then conduct extensive experiments on several large-scale datasets, demonstrating that our method achieves state-of-the-art performance in both learning efficacy and efficiency.
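The core idea of the abstract can be illustrated concretely for the RBF kernel, where Bochner's theorem gives Fourier frequencies omega ~ N(0, sigma^{-2} I) and the reparameterization omega = eps / sigma (with eps ~ N(0, I) fixed) makes the gradient w.r.t. the bandwidth sigma analytic. The following is a minimal numpy sketch of this idea only, not the paper's implementation; all names (eps, sigma, random_features) are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 2000   # number of random features
d = 5      # input dimension

# Sample the source of randomness once, independently of the kernel
# parameter: eps ~ N(0, I). The frequencies omega = eps / sigma then
# depend on sigma analytically, so gradients can be taken through them.
eps = rng.standard_normal((D, d))
b = rng.uniform(0.0, 2.0 * np.pi, size=D)

def random_features(x, sigma):
    """Feature map z(x) with z(x)^T z(y) ~= exp(-||x-y||^2 / (2 sigma^2))."""
    omega = eps / sigma                      # reparameterized frequencies
    return np.sqrt(2.0 / D) * np.cos(omega @ x + b)

def grad_sigma(x, sigma):
    """Analytic derivative d z(x) / d sigma, available because eps is fixed."""
    omega = eps / sigma
    return np.sqrt(2.0 / D) * np.sin(omega @ x + b) * (omega @ x) / sigma

# Sanity check: the feature inner product approximates the exact RBF kernel.
x, y = rng.standard_normal(d), rng.standard_normal(d)
sigma = 1.5
approx = random_features(x, sigma) @ random_features(y, sigma)
exact = np.exp(-np.sum((x - y) ** 2) / (2.0 * sigma ** 2))
```

In an online setting, each incoming example would be mapped through `random_features`, the linear model in that space updated by SGD, and `grad_sigma` used (via the chain rule) to update the kernel bandwidth at the same time.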
Pages: 2543-2549
Page count: 7