Noise-tolerant distribution-free learning of general geometric concepts

Cited by: 10
Authors
Bshouty, NH [1 ]
Goldman, SA
Mathias, HD
Suri, S
Tamaki, H
Affiliations
[1] Technion Israel Inst Technol, Dept Comp Sci, IL-32000 Haifa, Israel
[2] Washington Univ, Dept Comp Sci, St Louis, MO 63130 USA
[3] Ohio State Univ, Dept Comp & Informat Sci, Columbus, OH 43210 USA
[4] IBM Japan Ltd, Tokyo Res Lab, Yamamoto 242, Japan
[5] Univ Calgary, Calgary, AB, Canada
Keywords
computational learning; geometric concepts;
DOI
10.1145/290179.290184
Chinese Library Classification: TP3 [Computing technology, computer technology]
Discipline code: 0812
Abstract
We present an efficient algorithm for PAC-learning a very general class of geometric concepts over R^d for fixed d. More specifically, let T be any set of s halfspaces. Let x = (x_1, ..., x_d) be an arbitrary point in R^d. With each t ∈ T we associate a Boolean indicator function I_t(x) which is 1 if and only if x is in the halfspace t. The concept class C_s^d that we study consists of all concepts formed by any Boolean function over I_{t_1}, ..., I_{t_s} for t_i ∈ T. This class is much more general than any geometric concept class known to be PAC-learnable. Our results extend easily to learning efficiently any Boolean combination of a polynomial number of concepts selected from any concept class C over R^d, given that the VC-dimension of C depends only on d and there is a polynomial-time algorithm to determine whether some concept from C is consistent with a given set of labeled examples. We also present a statistical-query version of our algorithm that can tolerate random classification noise. Finally, we present a generalization of the standard ε-net result of Haussler and Welzl [1987] and apply it to give an alternative noise-tolerant algorithm for d = 2 based on geometric subdivisions.
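To make the concept class concrete, here is a minimal illustrative sketch (not code from the paper; all names are hypothetical): a concept in C_s^d is an arbitrary Boolean function applied to the indicator vector (I_{t_1}(x), ..., I_{t_s}(x)) of a fixed set of s halfspaces.

```python
def halfspace_indicator(w, b, x):
    """I_t(x) = 1 iff x lies in the halfspace {x : w . x >= b}."""
    return int(sum(wi * xi for wi, xi in zip(w, x)) >= b)

def concept(halfspaces, boolean_fn, x):
    """Evaluate a concept from C_s^d: apply an arbitrary Boolean
    function to the tuple of halfspace indicators for the point x."""
    indicators = tuple(halfspace_indicator(w, b, x) for (w, b) in halfspaces)
    return boolean_fn(indicators)

# Example in R^2: the intersection of two halfspaces (a wedge),
# i.e. the Boolean function AND over the two indicators.
T = [([1.0, 0.0], 0.0),   # x1 >= 0
     ([0.0, 1.0], 0.0)]   # x2 >= 0
wedge = lambda ind: all(ind)

print(concept(T, wedge, [1.0, 2.0]))    # inside the wedge -> True
print(concept(T, wedge, [-1.0, 2.0]))   # outside the wedge -> False
```

Because boolean_fn is unrestricted, the same machinery expresses unions, differences, and any other Boolean combination of the s halfspaces, which is what makes the class so general.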
Pages: 863-890 (28 pages)
Related papers (50 records)
  • [41] On the Complexity of Proper Distribution-Free Learning of Linear Classifiers
    Long, Philip M.
    Long, Raphael J.
    ALGORITHMIC LEARNING THEORY, VOL 117, 2020, 117 : 583 - 591
  • [42] Using distribution-free learning theory to analyze chunking
    Cohen, William W.
    Proceedings of the Biennial Conference of the Canadian Society for Computational Studies of Intelligence, 1990,
  • [43] HyperMatch: Noise-Tolerant Semi-Supervised Learning via Relaxed Contrastive Constraint
    Zhou, Beitong
    Lu, Jing
    Liu, Kerui
    Xu, Yunlu
    Cheng, Zhanzhan
    Niu, Yi
    2023 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2023, : 24017 - 24026
  • [44] Massive-Scale Genre Communities Learning Using a Noise-Tolerant Deep Architecture
    Zhang, Luming
    Ju, Xiaoming
    Yao, Yiyang
    Liu, Zhenguang
    IEEE TRANSACTIONS ON MULTIMEDIA, 2020, 22 (09) : 2467 - 2478
  • [45] Partial Multilabel Learning Using Noise-Tolerant Broad Learning System With Label Enhancement and Dimensionality Reduction
    Qian, Wenbin
    Tu, Yanqiang
    Huang, Jintao
    Shu, Wenhao
    Cheung, Yiu-Ming
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2025, 36 (02) : 3758 - 3772
  • [46] A general and more general classes of distribution-free tests for ordered location alternatives
    Bhat, Sharada V.
    Patil, Aparna B.
    INTERNATIONAL JOURNAL OF AGRICULTURAL AND STATISTICAL SCIENCES, 2007, 3 (02): : 319 - 335
  • [47] A DISTRIBUTION-FREE TEST FOR THE 2 SAMPLE PROBLEM FOR GENERAL ALTERNATIVES
    SCHMID, F
    TREDE, M
    COMPUTATIONAL STATISTICS & DATA ANALYSIS, 1995, 20 (04) : 409 - 419
  • [48] DISTRIBUTION-FREE GEOMETRIC UPPER BOUND FOR PROBABILITY OF ERROR OF A MINIMUM DISTANCE CLASSIFIER
    VANOTTERLOO, PJ
    YOUNG, IT
    PATTERN RECOGNITION, 1978, 10 (04) : 281 - 286
  • [49] Consistent camera-invariant and noise-tolerant learning for unsupervised person re-identification
    Chen, Yiyu
    Fan, Zheyi
    Chen, Shuni
    IMAGE AND VISION COMPUTING, 2022, 123
  • [50] Noise-Tolerant Self-Supervised Learning for Audio-Visual Voice Activity Detection
    Kim, Ui-Hyun
    INTERSPEECH 2021, 2021, : 326 - 330