Rethinking statistical learning theory: learning using statistical invariants

Cited by: 34
Authors
Vapnik, Vladimir [1 ,2 ]
Izmailov, Rauf [3 ]
Affiliations
[1] Columbia Univ, New York, NY USA
[2] Royal Holloway Univ London, Egham TW20 0EX, Surrey, England
[3] Perspecta Labs, Basking Ridge, NJ 07920 USA
Keywords
Intelligent teacher; Privileged information; Support vector machine; Neural network; Classification; Learning theory; Regression; Conditional probability; Kernel function; Ill-posed problem; Reproducing kernel Hilbert space; Weak convergence;
DOI
10.1007/s10994-018-5742-0
Chinese Library Classification
TP18 [Artificial intelligence theory];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
This paper introduces a new learning paradigm, called Learning Using Statistical Invariants (LUSI), which is different from the classical one. In the classical paradigm, the learning machine constructs a classification rule that minimizes the probability of expected error; this is a data-driven model of learning. In the LUSI paradigm, in order to construct the desired classification function, a learning machine computes statistical invariants that are specific to the problem, and then minimizes the expected error in a way that preserves these invariants; it is thus both data- and invariant-driven learning. From a mathematical point of view, methods of the classical paradigm employ mechanisms of strong convergence of approximations to the desired function, whereas methods of the new paradigm employ both strong and weak convergence mechanisms. This can significantly increase the rate of convergence.
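The abstract's core mechanism can be illustrated with a small sketch. A LUSI-style statistical invariant for a predicate phi requires the model's predictions to match the data on the statistic sum_i phi(x_i) f(x_i) = sum_i phi(x_i) y_i. The sketch below is not the authors' algorithm (which uses V-matrices and kernel estimators of conditional probability); it only shows, under simplifying assumptions (a linear model, squared loss, a single predicate phi = 1), how error minimization can be combined with exact preservation of one invariant via equality-constrained least squares:

```python
import numpy as np

# Hypothetical illustration, not the paper's method: fit f(x) = w.x by
# minimizing ||Xw - y||^2 subject to one invariant (phi^T X) w = phi^T y.

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=200)

phi = np.ones(200)   # predicate phi(x) = 1: preserves the empirical mean of y
A = phi @ X          # constraint row: (phi^T X) w = phi^T y
b = phi @ y

# KKT system for the equality-constrained least-squares problem
n = X.shape[1]
K = np.block([[2 * X.T @ X, A.reshape(-1, 1)],
              [A.reshape(1, -1), np.zeros((1, 1))]])
rhs = np.concatenate([2 * X.T @ y, [b]])
w = np.linalg.solve(K, rhs)[:n]

# the invariant holds exactly on the training data
assert np.isclose(phi @ (X @ w), phi @ y)
```

In the paper's terminology, the squared-loss term drives strong convergence of f to the target function, while the invariant constrains convergence of the chosen statistical functionals (weak convergence); richer predicates phi add further rows to the constraint.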
Pages: 381 - 423
Page count: 43
Related papers
50 entries in total
  • [1] Rethinking statistical learning theory: learning using statistical invariants
    Vladimir Vapnik
    Rauf Izmailov
    [J]. Machine Learning, 2019, 108 : 381 - 423
  • [2] Complete Statistical Theory of Learning (Learning Using Statistical Invariants)
    Vapnik, Vladimir
    Izmailov, Rauf
    [J]. CONFORMAL AND PROBABILISTIC PREDICTION AND APPLICATIONS, VOL 128, 2020, 128 : 4 - 40
  • [3] Two birational invariants in statistical learning theory
    Watanabe, Sumio
    [J]. SINGULARITIES IN GEOMETRY AND TOPOLOGY: STRASBOURG 2009, 2012, 20 : 249 - 268
  • [4] Learning using granularity statistical invariants for classification
    Zhu, Ting-Ting
    Li, Chun-Na
    Liu, Tian
    Shao, Yuan-Hai
    [J]. APPLIED INTELLIGENCE, 2024, 54 (08) : 6667 - 6681
  • [5] PROBABILITY LEARNING IN STATISTICAL LEARNING THEORY
    FEICHTINGER, G
    [J]. METRIKA, 1971, 18 (01) : 35 - 55
  • [6] Motion estimation using statistical learning theory
    Wechsler, H
    Duric, Z
    Li, FY
    Cherkassky, V
    [J]. IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2004, 26 (04) : 466 - 478
  • [7] A statistical approach for learning invariants: Application to image color correction and learning invariants to illumination
    Bascle, B.
    Bernier, O.
    Lemaire, V.
    [J]. NEURAL INFORMATION PROCESSING, PT 2, PROCEEDINGS, 2006, 4233 : 294 - 303
  • [8] Rethinking Statistical Analysis of Associative Learning in an Olfactometer
    Busquet, Nicolas
    Restrepo, Diego
    [J]. CHEMICAL SENSES, 2009, 34 (07) : A95 - A96
  • [9] TOWARD A STATISTICAL THEORY OF LEARNING
    ESTES, WK
    [J]. PSYCHOLOGICAL REVIEW, 1950, 57 (02) : 94 - 107
  • [10] Statistical learning theory: a tutorial
    Kulkarni, Sanjeev R.
    Harman, Gilbert
    [J]. WILEY INTERDISCIPLINARY REVIEWS-COMPUTATIONAL STATISTICS, 2011, 3 (06): : 543 - 556