Rethinking statistical learning theory: learning using statistical invariants

Cited by: 34
Authors
Vapnik, Vladimir [1 ,2 ]
Izmailov, Rauf [3 ]
Affiliations
[1] Columbia Univ, New York, NY USA
[2] Royal Holloway Univ London, Egham TW20 0EX, Surrey, England
[3] Perspecta Labs, Basking Ridge, NJ 07920 USA
Keywords
Intelligent teacher; Privileged information; Support vector machine; Neural network; Classification; Learning theory; Regression; Conditional probability; Kernel function; Ill-posed problem; Reproducing kernel Hilbert space; Weak convergence
DOI
10.1007/s10994-018-5742-0
CLC classification
TP18 (Artificial intelligence theory)
Discipline codes
081104; 0812; 0835; 1405
Abstract
This paper introduces a new learning paradigm, called Learning Using Statistical Invariants (LUSI), which differs from the classical one. In the classical paradigm, the learning machine constructs a classification rule that minimizes the probability of expected error; it is a data-driven model of learning. In the LUSI paradigm, in order to construct the desired classification function, the learning machine computes statistical invariants that are specific to the problem, and then minimizes the expected error in a way that preserves these invariants; learning is thus both data-driven and invariant-driven. From a mathematical point of view, methods of the classical paradigm employ mechanisms of strong convergence of approximations to the desired function, whereas methods of the new paradigm employ both strong and weak convergence mechanisms. This can significantly increase the rate of convergence.
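As a rough illustration of the invariant-preservation idea described in the abstract (this sketch is not taken from the paper itself): for a chosen predicate psi, a statistical invariant asks the learned decision function f to match the training labels on predicate-weighted averages, i.e. (1/l) Σ psi(x_i) f(x_i) ≈ (1/l) Σ psi(x_i) y_i. The function name `invariant_gap`, the toy threshold classifier, and the first-moment predicate below are all illustrative assumptions, not the paper's method.

```python
import numpy as np

def invariant_gap(psi, f, X, y):
    """Absolute gap between the predicate-weighted empirical averages
    of the classifier output f(x) and of the labels y."""
    l = len(X)
    lhs = sum(psi(x) * f(x) for x in X) / l
    rhs = sum(psi(x) * yi for x, yi in zip(X, y)) / l
    return abs(lhs - rhs)

# Toy 1-D data: labels are 1 on the positive half-line, 0 otherwise.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=200)
y = (X > 0).astype(float)

f = lambda x: 1.0 if x > 0 else 0.0   # classifier that reproduces the labels
psi = lambda x: x                     # first-moment predicate psi(x) = x

gap = invariant_gap(psi, f, X, y)
print(f"invariant gap: {gap:.6f}")    # exactly 0, since f(x_i) == y_i here
```

A LUSI-style learner would enforce small gaps of this kind for a set of predicates as constraints (a weak-convergence mechanism) while also fitting the data pointwise (strong convergence); a classifier that fits the labels perfectly, as above, satisfies every such invariant exactly.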
Pages: 381-423
Page count: 43