Noise-tolerant distribution-free learning of general geometric concepts

Cited by: 10
Authors
Bshouty, NH [1]
Goldman, SA
Mathias, HD
Suri, S
Tamaki, H
Affiliations
[1] Technion Israel Inst Technol, Dept Comp Sci, IL-32000 Haifa, Israel
[2] Washington Univ, Dept Comp Sci, St Louis, MO 63130 USA
[3] Ohio State Univ, Dept Comp & Informat Sci, Columbus, OH 43210 USA
[4] IBM Japan Ltd, Tokyo Res Lab, Yamamoto 242, Japan
[5] Univ Calgary, Calgary, AB, Canada
Keywords
computational learning; geometric concepts;
DOI
10.1145/290179.290184
CLC Classification
TP3 [Computing technology; computer technology]
Discipline Code
0812
Abstract
We present an efficient algorithm for PAC-learning a very general class of geometric concepts over R^d for fixed d. More specifically, let T be any set of s halfspaces. Let x = (x_1, ..., x_d) be an arbitrary point in R^d. With each t ∈ T we associate a Boolean indicator function I_t(x) which is 1 if and only if x is in the halfspace t. The concept class C_s^d that we study consists of all concepts formed by any Boolean function over I_{t_1}, ..., I_{t_s} for t_i ∈ T. This class is much more general than any geometric concept class known to be PAC-learnable. Our results extend easily to learning efficiently any Boolean combination of a polynomial number of concepts selected from any concept class C over R^d, provided the VC-dimension of C depends only on d and there is a polynomial-time algorithm to determine whether some concept from C is consistent with a given set of labeled examples. We also present a statistical query version of our algorithm that can tolerate random classification noise. Finally, we present a generalization of the standard ε-net result of Haussler and Welzl [1987] and apply it to give an alternative noise-tolerant algorithm for d = 2 based on geometric subdivisions.
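The concept class C_s^d above is defined purely by Boolean combinations of halfspace indicator bits. The Python sketch below is an illustration of that definition under assumed conventions, not the paper's learning algorithm: the halfspace parameters W and b, the majority combiner, and all function names are hypothetical choices made for the example.

```python
# Minimal sketch of evaluating a concept in C_s^d: s halfspaces over R^d
# combined by an arbitrary Boolean function of their indicator bits.
# All parameters below are illustrative, not from the paper.
import numpy as np

def halfspace_indicators(x, W, b):
    """Indicator bits I_{t_1}(x), ..., I_{t_s}(x) for halfspaces w_i . x >= b_i."""
    return (W @ x >= b).astype(int)

def concept(x, W, b, boolean_fn):
    """Label of x under the concept given by halfspaces (W, b) and a
    combining Boolean function boolean_fn: {0,1}^s -> {0,1}."""
    return boolean_fn(halfspace_indicators(x, W, b))

# Example: d = 2, s = 3 halfspaces, concept = "inside at least two of the three".
rng = np.random.default_rng(0)
W = rng.normal(size=(3, 2))          # one row per halfspace normal vector
b = rng.normal(size=3)               # thresholds
majority = lambda bits: int(bits.sum() >= 2)

x = rng.normal(size=2)
print(concept(x, W, b, majority))    # prints 0 or 1
```

Under the random classification noise model addressed by the statistical query version of the algorithm, each label seen by the learner would additionally be flipped independently with some probability η < 1/2.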
Pages: 863-890
Number of pages: 28