Domain Invariant Transfer Kernel Learning

Cited by: 166
Authors
Long, Mingsheng [1 ,2 ]
Wang, Jianmin [1 ,3 ]
Sun, Jiaguang [1 ,3 ]
Yu, Philip S. [4 ]
Affiliations
[1] Tsinghua Univ, Sch Software, Beijing 100084, Peoples R China
[2] Tsinghua Univ, Dept Comp Sci, Beijing 100084, Peoples R China
[3] Tsinghua Univ, Tsinghua Natl Lab Informat Sci & Technol, Beijing 100084, Peoples R China
[4] Univ Illinois, Dept Comp Sci, Chicago, IL 60607 USA
Funding
US National Science Foundation;
Keywords
Transfer learning; kernel learning; Nystrom method; text mining; image classification; video recognition; REGULARIZATION; FRAMEWORK;
DOI
10.1109/TKDE.2014.2373376
CLC Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Domain transfer learning generalizes a learning model across training and testing data drawn from different distributions. A general principle for tackling this problem is to reduce the distribution difference between training and testing data so that the generalization error can be bounded. Current methods typically model the sample distributions in the input feature space, which depends on a nonlinear feature mapping to embody the distribution discrepancy. However, this nonlinear feature space may not be optimal for kernel-based learning machines. To this end, we propose a transfer kernel learning (TKL) approach to learn a domain-invariant kernel by directly matching the source and target distributions in the reproducing kernel Hilbert space (RKHS). Specifically, we design a family of spectral kernels by extrapolating the target eigensystem onto the source samples via Mercer's theorem. The spectral kernel that minimizes the approximation error to the ground-truth kernel is selected to construct domain-invariant kernel machines. Comprehensive experimental evidence on a large number of text categorization, image classification, and video event recognition datasets verifies the effectiveness and efficiency of the proposed TKL approach over several state-of-the-art methods.
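The core mechanism the abstract describes, extrapolating the target kernel's eigensystem onto source samples, can be illustrated with a minimal NumPy sketch of the Nyström method. This is only an illustration of the general Nyström extrapolation step under an assumed RBF kernel; the function name `nystrom_extrapolate` and all parameter choices are our own, and the sketch does not include the paper's spectral-kernel relaxation or its approximation-error minimization.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Gaussian (RBF) kernel matrix between the rows of A and B."""
    sq = (A**2).sum(1)[:, None] + (B**2).sum(1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * sq)

def nystrom_extrapolate(Xs, Xt, gamma=1.0):
    """Extrapolate the target kernel eigensystem onto source samples.

    Nystrom step: eigendecompose the target kernel K_tt = U diag(lam) U^T,
    approximate the eigenfunctions at the source points via
    phi(x) = K(x, Xt) U / lam (up to scaling), then rebuild a source-domain
    kernel that shares the target's eigensystem:
    K_ss ~ Phi_s diag(lam) Phi_s^T = K_st K_tt^+ K_ts.
    """
    K_tt = rbf_kernel(Xt, Xt, gamma)        # target kernel (m x m)
    lam, U = np.linalg.eigh(K_tt)           # eigenvalues in ascending order
    keep = lam > 1e-8                       # drop numerically null directions
    lam, U = lam[keep], U[:, keep]
    K_st = rbf_kernel(Xs, Xt, gamma)        # cross kernel (n x m)
    Phi_s = K_st @ U / lam                  # eigenvectors extrapolated to source
    return (Phi_s * lam) @ Phi_s.T          # source kernel in target eigensystem
```

As a sanity check, feeding the target samples back in as the source (`Xs = Xt`) reconstructs the target kernel itself, since the extrapolated eigenvectors then coincide with `U`. TKL goes further than this sketch: rather than reusing the target eigenvalues verbatim, it learns a relaxed eigenspectrum over the extrapolated eigensystem to minimize the approximation error to the ground-truth kernel.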
Pages: 1519-1532 (14 pages)
Related Papers
50 records
  • [1] Domain Transfer Multiple Kernel Learning
    Duan, Lixin
    Tsang, Ivor W.
    Xu, Dong
    [J]. IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2012, 34 (03) : 465 - 479
  • [2] Kernel-Based Domain-Invariant Feature Selection in Hyperspectral Images for Transfer Learning
    Persello, Claudio
    Bruzzone, Lorenzo
    [J]. IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2016, 54 (05) : 2615 - 2626
  • [3] Domain Adaptation Transfer Learning by Kernel Representation Adaptation
    Chen, Xiaoyi
    Lengelle, Regis
    [J]. PATTERN RECOGNITION APPLICATIONS AND METHODS, 2018, 10857 : 45 - 61
  • [4] Cross-Domain Kernel Induction for Transfer Learning
    Chang, Wei-Cheng
    Wu, Yuexin
    Liu, Hanxiao
    Yang, Yiming
    [J]. THIRTY-FIRST AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2017, : 1763 - 1769
  • [5] On Learning Domain-Invariant Representations for Transfer Learning with Multiple Sources
    Trung Phung
    Trung Le
    Long Vuong
    Toan Tran
    Anh Tran
    Bui, Hung
    Dinh Phung
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [6] SIMPLIFIED DOMAIN TRANSFER MULTIPLE KERNEL LEARNING FOR LANGUAGE RECOGNITION
    Xu, Jiaming
    Liu, Jia
    Xia, Shanhong
    [J]. 2013 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2013, : 6866 - 6869
  • [7] Learning Invariant Representations with Kernel Warping
    Ma, Yingyi
    Ganapathiraman, Vignesh
    Zhang, Xinhua
    [J]. 22ND INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 89, 2019, 89
  • [8] Transfer learning of coverage functions via invariant properties in the Fourier domain
    Tseng, Kuo-Shih
    [J]. AUTONOMOUS ROBOTS, 2021, 45 (04) : 519 - 542
  • [9] Rearrangement Invariant Optimal Domain for Monotone Kernel Operators
    Delgado, Olvido
    [J]. VECTOR MEASURES, INTEGRATION AND RELATED TOPICS, 2010, 201 : 149 - 158