Isolation kernel: the X factor in efficient and effective large scale online kernel learning

Cited by: 0
Authors
Kai Ming Ting
Jonathan R. Wells
Takashi Washio
Affiliations
[1] Nanjing University, National Key Laboratory for Novel Software Technology
[2] Deakin University, School of Information Technology
[3] Osaka University, The Institute of Scientific and Industrial Research
Source
Data Mining and Knowledge Discovery | 2021, Volume 35
Keywords
Data dependent kernel; Online kernel learning; Kernel functional approximation; Large scale data mining
DOI
Not available
Abstract
Large scale online kernel learning aims to build an efficient and scalable kernel-based predictive model incrementally from a sequence of potentially infinite data points. Current state-of-the-art methods focus on improving efficiency, typically through two forms of approximation: (1) limiting the number of support vectors, and (2) using an approximate feature map. Both commonly employ a kernel whose feature map has intractable dimensionality. While these approaches handle large scale datasets efficiently, they do so by compromising predictive accuracy because of the approximation. We offer an alternative approach that puts the kernel itself at the heart of the method: it creates a sparse and finite-dimensional feature map of a kernel called Isolation Kernel. With this approach, large scale online kernel learning becomes extremely simple: use Isolation Kernel instead of a kernel whose feature map has intractable dimensionality. We show that, using Isolation Kernel, large scale online kernel learning can be achieved efficiently without sacrificing accuracy.
Pages: 2282–2312
Page count: 30
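
The abstract's core idea is that Isolation Kernel admits a sparse, finite-dimensional feature map that can be fed directly to any online linear learner. The sketch below is an illustrative reading of that idea, not the paper's implementation: it assumes the Voronoi-partition construction of Isolation Kernel (ψ randomly sampled points define ψ cells; each of t partitionings one-hot encodes the cell a point falls into), and names such as isolation_feature_map and all parameter values are our own choices.

```python
import numpy as np
from sklearn.linear_model import SGDClassifier

def build_partitionings(pool, t=100, psi=16, seed=0):
    """Each partitioning is psi points drawn from a reference pool;
    their Voronoi cells partition the input space."""
    rng = np.random.default_rng(seed)
    return [pool[rng.choice(len(pool), size=psi, replace=False)]
            for _ in range(t)]

def isolation_feature_map(X, partitionings):
    """Sparse, finite-dimensional feature map: for each partitioning,
    one-hot encode the cell (nearest sampled point) containing each x."""
    t, psi = len(partitionings), len(partitionings[0])
    Phi = np.zeros((len(X), t * psi))
    for i, centers in enumerate(partitionings):
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        Phi[np.arange(len(X)), i * psi + d.argmin(axis=1)] = 1.0
    return Phi / np.sqrt(t)  # <phi(x), phi(y)> = fraction of shared cells

# Toy online learning loop on synthetic two-class streaming data.
rng = np.random.default_rng(1)
X_stream = rng.normal(size=(2000, 5))
y_stream = (X_stream[:, :2].sum(axis=1) > 0).astype(int)

# Bootstrap the partitionings from an initial buffer of the stream.
partitionings = build_partitionings(X_stream[:200], t=100, psi=16)
clf = SGDClassifier(loss="hinge")            # linear learner in feature space
for start in range(0, len(X_stream), 100):   # process the stream in mini-batches
    Xb, yb = X_stream[start:start + 100], y_stream[start:start + 100]
    Phi = isolation_feature_map(Xb, partitionings)
    clf.partial_fit(Phi, yb, classes=np.array([0, 1]))

print("last-batch accuracy:", clf.score(Phi, yb))
```

Because each mapped point has exactly t non-zero entries in a t·ψ-dimensional space, the map is both sparse and finite, which is what lets a standard incremental linear learner (here SGDClassifier with partial_fit) stand in for kernel machines whose feature maps have intractable dimensionality; a production version would use a sparse matrix representation rather than the dense array above.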