Rethinking statistical learning theory: learning using statistical invariants

Cited by: 0
Authors
Vladimir Vapnik
Rauf Izmailov
Affiliations
[1] Columbia University
[2] Royal Holloway, University of London
[3] Perspecta Labs
Source
Machine Learning | 2019, Volume 108
Keywords
Intelligent teacher; Privileged information; Support vector machine; Neural network; Classification; Learning theory; Regression; Conditional probability; Kernel function; Ill-posed problem; Reproducing kernel Hilbert space; Weak convergence; 68Q32; 68T05; 68T30; 83C32
DOI
Not available
Abstract
This paper introduces a new learning paradigm, called Learning Using Statistical Invariants (LUSI), which is different from the classical one. In the classical paradigm, the learning machine constructs a classification rule that minimizes the probability of expected error; this is a data-driven model of learning. In the LUSI paradigm, in order to construct the desired classification function, a learning machine computes statistical invariants that are specific to the problem, and then minimizes the expected error in a way that preserves these invariants; it is thus both data- and invariant-driven learning. From a mathematical point of view, methods of the classical paradigm employ mechanisms of strong convergence of approximations to the desired function, whereas methods of the new paradigm employ both strong and weak convergence mechanisms. This can significantly increase the rate of convergence.
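To make the invariant-preservation idea concrete, the constraint can be sketched as follows (illustrative notation only, assuming f approximates the conditional probability P(y = 1 | x), \psi is a problem-specific predicate, and (x_i, y_i), i = 1, ..., \ell, are the training pairs; this is a sketch based on the abstract, not a formula quoted from the paper):

    % illustrative sketch of one statistical invariant (assumed notation, not quoted from the paper)
    \frac{1}{\ell}\sum_{i=1}^{\ell} \psi(x_i)\, f(x_i) \;\approx\; \frac{1}{\ell}\sum_{i=1}^{\ell} \psi(x_i)\, y_i

Requiring the approximation to satisfy such equalities for a chosen set of predicates \psi_1, ..., \psi_m is a weak-convergence mechanism: the estimate must reproduce selected statistics of the training data, in addition to minimizing the expected error on those data.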
Pages: 381-423 (42 pages)
Related papers (50 in total)
  • [11] Statistical learning theory: a tutorial
    Kulkarni, Sanjeev R.
    Harman, Gilbert
    WILEY INTERDISCIPLINARY REVIEWS-COMPUTATIONAL STATISTICS, 2011, 3 (06): 543 - 556
  • [12] Complete Statistical Theory of Learning
    Vapnik, V. N.
    AUTOMATION AND REMOTE CONTROL, 2019, 80 (11) : 1949 - 1975
  • [13] An overview of statistical learning theory
    Vapnik, VN
    IEEE TRANSACTIONS ON NEURAL NETWORKS, 1999, 10 (05): 988 - 999
  • [14] Introduction to statistical learning theory
    Bousquet, O
    Boucheron, S
    Lugosi, G
    ADVANCED LECTURES ON MACHINE LEARNING, 2004, 3176 : 169 - 207
  • [16] Statistical Learning Theory: A Primer
    Evgeniou, T
    Pontil, M
    Poggio, T
    INTERNATIONAL JOURNAL OF COMPUTER VISION, 2000, 38 (01): 9 - 13
  • [19] Applying statistical learning theory to deep learning
    Gerbelot, Cedric
    Karagulyan, Avetik
    Karp, Stefani
    Ravichandran, Kavya
    Stern, Menachem
    Srebro, Nathan
    JOURNAL OF STATISTICAL MECHANICS-THEORY AND EXPERIMENT, 2024, 2024 (10)
  • [20] Statistical learning of language: Theory, validity, and predictions of a statistical learning account of language acquisition
    Erickson, Lucy C.
    Thiessen, Erik D.
    DEVELOPMENTAL REVIEW, 2015, 37 : 66 - 108