A tutorial on kernel methods for categorization

Cited by: 54
Authors
Jaekel, Frank [1 ,2 ,3 ]
Schoelkopf, Bernhard [1 ]
Wichmann, Felix A. [1 ,2 ,3 ]
Affiliations
[1] Max Planck Inst Biol Cybernet, D-72076 Tubingen, Germany
[2] Tech Univ Berlin, Fak 4, D-10587 Berlin, Germany
[3] Bernstein Ctr Computat Neurosci, D-10115 Berlin, Germany
Keywords
kernel; similarity; machine learning; generalization; categorization; SIMILARITY; EXEMPLAR; MODELS; CLASSIFICATION
DOI
10.1016/j.jmp.2007.06.002
CLC Number
O1 [Mathematics]
Discipline Code
0701; 070101
Abstract
The abilities to learn and to categorize are fundamental for cognitive systems, be they animals or machines, and have therefore attracted attention from engineers and psychologists alike. Modern machine learning methods and psychological models of categorization are remarkably similar, partly because the two fields share a common history in artificial neural networks and reinforcement learning. However, machine learning is now an independent and mature field that has moved beyond psychologically or neurally inspired algorithms towards providing foundations for a theory of learning that is rooted in statistics and functional analysis. Much of this research is potentially interesting for psychological theories of learning and categorization but is hardly accessible to psychologists. Here, we provide a tutorial introduction to a popular class of machine learning tools, called kernel methods. These methods are closely related to perceptrons, radial-basis-function neural networks and exemplar theories of categorization. Recent theoretical advances in machine learning are closely tied to the idea that the similarity of patterns can be encapsulated in a positive definite kernel. Such a positive definite kernel can define a reproducing kernel Hilbert space, which allows one to use powerful tools from functional analysis in the analysis of learning algorithms. We give basic explanations of some key concepts, namely the so-called kernel trick, the representer theorem and regularization, which may open up the possibility that insights from machine learning can feed back into psychology. (C) 2007 Elsevier Inc. All rights reserved.
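A minimal sketch of the connection the abstract draws between kernel methods and exemplar theories of categorization: by the representer theorem, a kernel classifier's decision function is a weighted sum of kernel similarities to stored training exemplars. The Gaussian (RBF) kernel, the uniform exemplar weights and the ±1 labels below are illustrative assumptions, not details from the paper.

```python
import numpy as np

def rbf_kernel(x, y, gamma=1.0):
    """Positive definite Gaussian (RBF) kernel: similarity decays
    exponentially with squared Euclidean distance."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    return np.exp(-gamma * np.sum((x - y) ** 2))

def kernel_classifier(exemplars, labels, x, gamma=1.0):
    """Exemplar-style categorization: sum kernel similarities of x to
    every stored exemplar, weighted by its category label (+1 or -1),
    and take the sign of the total evidence."""
    score = sum(y_i * rbf_kernel(x_i, x, gamma)
                for x_i, y_i in zip(exemplars, labels))
    return 1 if score >= 0 else -1

# Two categories of 2-D exemplars around (0, 0.5) and (3.5, 3).
exemplars = [[0, 0], [0, 1], [3, 3], [4, 3]]
labels = [-1, -1, 1, 1]
print(kernel_classifier(exemplars, labels, [3.5, 3.0]))  # -> 1
print(kernel_classifier(exemplars, labels, [0.0, 0.5]))  # -> -1
```

In learned kernel machines (e.g. support vector machines) the per-exemplar weights are fitted via regularization rather than fixed at ±1, but the decision function keeps this same similarity-to-exemplars form.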
Pages: 343-358
Page count: 16
Related Papers
50 records
  • [41] EFFICIENT KERNEL DESCRIPTOR FOR IMAGE CATEGORIZATION VIA PIVOTS SELECTION
    Xie, Bojun
    Liu, Yi
    Zhang, Hui
    Yu, Jian
    2013 20TH IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING (ICIP 2013), 2013, : 3479 - 3483
  • [42] Eigenvalues Ratio for Kernel Selection of Kernel Methods
    Liu, Yong
    Liao, Shizhong
    PROCEEDINGS OF THE TWENTY-NINTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2015, : 2814 - 2820
  • [43] A Tutorial on Auditory Attention Identification Methods
    Alickovic, Emina
    Lunner, Thomas
    Gustafsson, Fredrik
    Ljung, Lennart
    FRONTIERS IN NEUROSCIENCE, 2019, 13
  • [44] GLOBALLY CONVERGENT HOMOTOPY METHODS - A TUTORIAL
    WATSON, LT
    APPLIED MATHEMATICS AND COMPUTATION, 1989, 31 : 369 - 396
  • [45] Electron spin echo methods: A tutorial
    Britt, RD
    PARAMAGNETIC RESONANCE OF METALLOBIOMOLECULES, 2003, 858 : 16 - 54
  • [46] Bayesian Methods in Interaction Design (Tutorial)
    Williamson, John
    Oulasvirta, Antti
    Kristensson, Per Ola
    PROCEEDINGS OF THE 25TH INTERNATIONAL CONFERENCE ON INTELLIGENT USER INTERFACES COMPANION (IUI'20), 2020, : 11 - 12
  • [47] Tutorial on Statistical Methods for Validation Tests
    Bonnini, Stefano
    Pia, Maria Grazia
    Ronchieri, Elisabetta
    2017 IEEE NUCLEAR SCIENCE SYMPOSIUM AND MEDICAL IMAGING CONFERENCE (NSS/MIC), 2017,
  • [48] Introduction to Kernel Methods
    Jaekel, F.
    PERCEPTION, 2013, 42 : 3 - 3
  • [49] Bayesian kernel methods
    Smola, AJ
    Schölkopf, B
    ADVANCED LECTURES ON MACHINE LEARNING, 2002, 2600 : 65 - 117
  • [50] Kernel methods for clustering
    Camastra, Francesco
    NEURAL NETS, 2006, 3931 : 1 - 9