Large-Scale Kernel-Based Feature Extraction via Low-Rank Subspace Tracking on a Budget

Cited by: 10
Authors
Sheikholeslami, Fatemeh [1 ,2 ]
Berberidis, Dimitris [1 ,2 ]
Giannakis, Georgios B. [1 ,2 ]
Affiliations
[1] Univ Minnesota, Dept Elect & Comp Engn, Minneapolis, MN 55455 USA
[2] Univ Minnesota, Digital Technol Ctr, Minneapolis, MN 55455 USA
Keywords
Online nonlinear feature extraction; kernel methods; classification; regression; budgeted learning; nonlinear subspace tracking; COMPONENT ANALYSIS; SAMPLE COMPLEXITY; NYSTROM METHOD; MATRIX; PERCEPTRON; SPARSITY; ONLINE;
DOI
10.1109/TSP.2018.2802446
Chinese Library Classification
TM (Electrical Engineering); TN (Electronics and Communication Technology)
Discipline Classification Codes
0808; 0809
Abstract
Kernel-based methods enjoy powerful generalization capabilities in learning a variety of pattern recognition tasks. When such methods are provided with sufficient training data, broadly applicable classes of nonlinear functions can be approximated with desired accuracy. Nevertheless, inherent to the nonparametric nature of kernel-based estimators are computational and memory requirements that become prohibitive with large-scale datasets. In response to this formidable challenge, this paper puts forward a low-rank, kernel-based, feature extraction approach that is particularly tailored for online operation. A novel generative model is introduced to approximate high-dimensional (possibly infinite) features via a low-rank nonlinear subspace, the learning of which lends itself to a kernel function approximation. Offline and online solvers are developed for the subspace learning task, along with affordable versions in which the number of stored data vectors is confined to a predefined budget. Analytical results provide performance bounds on how well the kernel matrix as well as kernel-based classification and regression tasks can be approximated by leveraging budgeted online subspace learning and feature extraction schemes. Tests on synthetic and real datasets demonstrate and benchmark the efficiency of the proposed method for dynamic nonlinear subspace tracking as well as online classification and regression tasks.
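The abstract only summarizes the approach; the paper's own offline and online solvers are not reproduced in this record. As a rough, non-authoritative illustration of the budgeted idea it describes (keeping a small dictionary of stored vectors and mapping incoming samples through a low-rank kernel factorization), the Python sketch below maintains at most `budget` data vectors and extracts rank-r Nystrom-style features. The class name `BudgetedKernelFeatures`, the FIFO eviction rule, the Gaussian kernel, and the full refit per sample are illustrative assumptions, not the authors' algorithm.

```python
import numpy as np


def rbf_kernel(X, Y, gamma=1.0):
    """Gaussian (RBF) kernel matrix between the rows of X and Y."""
    sq_dists = (np.sum(X ** 2, axis=1)[:, None]
                + np.sum(Y ** 2, axis=1)[None, :]
                - 2.0 * X @ Y.T)
    return np.exp(-gamma * np.maximum(sq_dists, 0.0))


class BudgetedKernelFeatures:
    """Toy budgeted, Nystrom-style nonlinear feature extractor (illustrative only).

    Stores at most `budget` data vectors (the dictionary) and maps every
    incoming sample to a rank-r feature vector through a low-rank
    factorization of the dictionary kernel matrix.
    """

    def __init__(self, budget=50, rank=10, gamma=1.0):
        self.budget, self.rank, self.gamma = budget, rank, gamma
        self.dictionary = None   # at most `budget` stored rows
        self.projection = None   # (rank x B) map from kernel values to features

    def _refit(self):
        # Low-rank eigendecomposition of the dictionary kernel matrix.
        # Refit from scratch for clarity; a genuinely online solver would
        # update this factorization recursively instead.
        K = rbf_kernel(self.dictionary, self.dictionary, self.gamma)
        vals, vecs = np.linalg.eigh(K)                      # ascending order
        top = np.argsort(vals)[::-1][: self.rank]           # keep top-r modes
        vals = np.clip(vals[top], 1e-12, None)
        self.projection = (vecs[:, top] / np.sqrt(vals)).T  # rank x B

    def partial_fit(self, x):
        """Admit one new sample; evict the oldest stored vector once over budget."""
        x = np.atleast_2d(x)
        if self.dictionary is None:
            self.dictionary = x
        elif len(self.dictionary) < self.budget:
            self.dictionary = np.vstack([self.dictionary, x])
        else:
            self.dictionary = np.vstack([self.dictionary[1:], x])  # FIFO eviction
        self._refit()
        return self

    def transform(self, X):
        """Map samples to low-dimensional nonlinear features."""
        K_new = rbf_kernel(np.atleast_2d(X), self.dictionary, self.gamma)
        return K_new @ self.projection.T


# Usage sketch: stream 200 random 2-D samples, then featurize new points.
extractor = BudgetedKernelFeatures(budget=50, rank=10, gamma=0.5)
for sample in np.random.randn(200, 2):
    extractor.partial_fit(sample)
features = extractor.transform(np.random.randn(5, 2))   # shape (5, 10)
```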
Pages: 1967-1981 (15 pages)
Related Papers (showing 10 of 50)
  • [1] Sheikholeslami, Fatemeh; Berberidis, Dimitris; Giannakis, Georgios B. Kernel-based Low-rank Feature Extraction on a Budget for Big Data Streams. 2015 IEEE Global Conference on Signal and Information Processing (GlobalSIP), 2015: 928-932.
  • [2] Chen, Yong; Zeng, Jinshan; He, Wei; Zhao, Xi-Le; Jiang, Tai-Xiang; Huang, Qing. Fast Large-Scale Hyperspectral Image Denoising via Noniterative Low-Rank Subspace Representation. IEEE Transactions on Geoscience and Remote Sensing, 2024, 62.
  • [3] Wang, Lijun; Rege, Manjeet; Dong, Ming; Ding, Yongsheng. Low-Rank Kernel Matrix Factorization for Large-Scale Evolutionary Clustering. IEEE Transactions on Knowledge and Data Engineering, 2012, 24(6): 1036-1050.
  • [4] Zu, Baokai; Xia, Kewen; Dai, Shuidong; Aslam, Nelofar. Low-Rank Kernel-Based Semisupervised Discriminant Analysis. Applied Computational Intelligence and Soft Computing, 2016, 2016.
  • [5] Sheikholeslami, Fatemeh; Giannakis, Georgios B. Scalable Kernel-based Learning via Low-rank Approximation of Lifted Data. 2017 55th Annual Allerton Conference on Communication, Control, and Computing (Allerton), 2017: 596-603.
  • [6] Zhao, Li-Na; Ma, Fang; Yang, Hong-Wei. Subspace Clustering and Feature Extraction Based on Latent Sparse Low-Rank Representation. Proceedings of 2019 International Conference on Machine Learning and Cybernetics (ICMLC), 2019: 95-100.
  • [7] Liu, Guangcan; Yan, Shuicheng. Latent Low-Rank Representation for Subspace Segmentation and Feature Extraction. 2011 IEEE International Conference on Computer Vision (ICCV), 2011: 1615-1622.
  • [8] Teixeira, A. R.; Tome, A. M.; Lang, E. W. Feature Extraction Using Low-Rank Approximations of the Kernel Matrix. Image Analysis and Recognition, Proceedings, 2008, 5112: 404+.
  • [9] Yu, Xiao; Liu, Hui; Wu, Yan; Ruan, Huaijun. Kernel-based Low-rank Tensorized Multiview Spectral Clustering. International Journal of Intelligent Systems, 2021, 36(2): 757-777.
  • [10] Hatamirad, Sarvenaz; Pedram, Mir Mohsen. Low-rank Approximation of Large-scale Matrices via Randomized Methods. Journal of Supercomputing, 2018, 74(2): 830-844.