Regimes of No Gain in Multi-class Active Learning

Cited: 0
Authors
Yuan, Gan [1 ]
Zhao, Yunfan [2 ]
Kpotufe, Samory [1 ]
Affiliations
[1] Columbia University in the City of New York, Department of Statistics, New York, NY 10027, USA
[2] Columbia University in the City of New York, Department of Industrial Engineering and Operations Research, New York, NY 10027, USA
Keywords
active learning; margin conditions; minimax lower bound; multi-class classification; non-parametric classification; BOUNDS; RATES; RISK
DOI
Not available
Chinese Library Classification
TP [automation technology; computer technology]
Discipline classification code
0812
Abstract
We consider nonparametric classification with smooth regression functions, where it is well known that notions of margin in P(Y = y | X = x) determine fast or slow rates in both active and passive learning. Here we elucidate a striking distinction, most relevant in multi-class settings, between active and passive learning. Namely, we show that some seemingly benign nuances in notions of margin, involving the uniqueness of the Bayes classifier, which have no apparent effect on rates in passive learning, determine whether or not any active learner can outperform passive learning rates. While a shorter conference version of this work already alluded to these nuances, it focused on the binary case and thus failed to be conclusive as to the source of difficulty in the multi-class setting: we show here that it suffices that the Bayes classifier fails to be unique, as opposed to needing all classes to be Bayes optimal, for active learning to yield no gain over passive learning. More precisely, we show that under Tsybakov's margin condition (allowing general situations with non-unique Bayes classifiers), no active learner can improve on the worst-case rate of passive learning in commonly studied settings where the marginal on X is near uniform. Our results thus negate the usual intuition from past literature that active rates should improve over passive rates in nonparametric classification; as such, these nuances allow us to better characterize the actual sources of gain in active over passive learning.
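For readers unfamiliar with the terminology in the abstract: in the binary case, Tsybakov's margin (low-noise) condition is commonly stated as below; the multi-class variant analyzed in the paper may differ in its exact formulation, so this is only an orienting sketch.

```latex
% Tsybakov's margin condition (binary case), where
% \eta(x) = P(Y = 1 \mid X = x) is the regression function and
% P_X is the marginal distribution of X: there exist constants
% C > 0 and \alpha \ge 0 such that, for all t > 0,
\[
  P_X\!\left( \left| \eta(X) - \tfrac{1}{2} \right| \le t \right)
  \;\le\; C\, t^{\alpha}.
\]
% Larger \alpha means little mass near the decision boundary
% \eta(x) = 1/2 and yields faster classification rates;
% \alpha = 0 imposes no restriction. Note the condition places no
% constraint on whether the Bayes classifier is unique, which is
% the nuance the paper identifies as decisive for active learning.
```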
Pages: 31