Multi-Class Active Learning by Uncertainty Sampling with Diversity Maximization

Cited by: 0

Authors:
Yi Yang
Zhigang Ma
Feiping Nie
Xiaojun Chang
Alexander G. Hauptmann
Affiliations:
[1] University of Technology Sydney,Centre for Quantum Computation and Intelligent Systems
[2] Carnegie Mellon University,School of Computer Science
[3] Northwestern Polytechnical University,The Center for OPTical IMagery Analysis and Learning
Keywords: Active learning; Uncertainty sampling; Diversity maximization
DOI: not available
Abstract
As a way to relieve the tedious work of manual annotation, active learning plays an important role in many applications of visual concept recognition. In typical active learning scenarios, the seed set contains only a small number of labelled examples. However, most existing active learning algorithms exploit only the labelled data and therefore tend to over-fit to those few examples. Moreover, while much progress has been made in binary-class active learning, little attention has been paid to the multi-class setting. In this paper, we propose a semi-supervised batch-mode multi-class active learning algorithm for visual concept recognition. Our algorithm exploits the whole active pool to evaluate the uncertainty of the data. Because uncertain data points tend to be similar to one another, we make the selected batch as diverse as possible by explicitly imposing a diversity constraint on the objective function. As a multi-class active learning algorithm, ours exploits uncertainty across multiple classes, and an efficient algorithm optimizes the objective function. Extensive experiments on action recognition, object classification, scene recognition, and event detection demonstrate its advantages.
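
For intuition, the Python sketch below pairs a margin-based multi-class uncertainty score with a greedy diversity penalty when assembling a query batch. This is only an illustrative approximation of the idea the abstract describes, not the paper's actual method: the function names (margin_uncertainty, select_batch), the cosine-similarity redundancy term, and the trade-off weight lam are assumptions of this sketch, whereas the paper formulates diversity as an explicit constraint in a joint objective solved by an efficient optimization.

import numpy as np

def margin_uncertainty(probs):
    # probs: (n_samples, n_classes) class-posterior estimates for the pool.
    # A small gap between the top two classes means the model is uncertain,
    # so we negate the best-versus-second-best margin (higher = more uncertain).
    part = np.sort(probs, axis=1)
    return -(part[:, -1] - part[:, -2])

def select_batch(pool_X, probs, batch_size, lam=1.0):
    # Greedily pick a batch that trades off uncertainty against redundancy:
    # each candidate is penalized by its maximum cosine similarity to the
    # points already selected, so near-duplicates are not chosen together.
    unc = margin_uncertainty(probs)
    # L2-normalize features so dot products are cosine similarities.
    X = pool_X / (np.linalg.norm(pool_X, axis=1, keepdims=True) + 1e-12)
    selected = []
    for _ in range(batch_size):
        if selected:
            sim = X @ X[selected].T        # (n, k) similarities to chosen set
            redundancy = sim.max(axis=1)   # closeness to nearest chosen point
        else:
            redundancy = np.zeros(len(X))
        score = unc - lam * redundancy
        score[selected] = -np.inf          # never pick the same point twice
        selected.append(int(np.argmax(score)))
    return selected

# Toy usage with a random pool and a mock 3-class posterior.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    pool = rng.normal(size=(200, 16))
    logits = rng.normal(size=(200, 3))
    probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
    print("queried indices:", select_batch(pool, probs, batch_size=5))

Raising lam shifts the selection from pure uncertainty sampling toward a more spread-out batch, which mirrors the trade-off the abstract motivates.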
Pages: 113-127 (14 pages)
Related papers (showing a subset of 50 records):
  • [31] Multi-class learning from class proportions
    Wang, Zilei
    Feng, Jiashi
    NEUROCOMPUTING, 2013, 119 : 273 - 280
  • [32] Beyond Active Noun Tagging: Modeling Contextual Interactions for Multi-Class Active Learning
    Siddiquie, Behjat
    Gupta, Abhinav
    2010 IEEE CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2010, : 2979 - 2986
  • [34] Learning Multi-class Theories in ILP
    Abudawood, Tarek
    Flach, Peter A.
    INDUCTIVE LOGIC PROGRAMMING, ILP 2010, 2011, 6489 : 6 - 13
  • [35] Multi-class learning by smoothed boosting
    Jin, Rong
    Zhang, Jian
    MACHINE LEARNING, 2007, 67 (03) : 207 - 227
  • [37] Stable Learning in Coding Space for Multi-Class Decoding and Its Extension for Multi-Class Hypothesis Transfer Learning
    Zhang, Bang
    Wang, Yi
    Wang, Yang
    Chen, Fang
    2014 IEEE CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2014, : 1075 - 1081
  • [38] DynaQ: online learning from imbalanced multi-class streams through dynamic sampling
    Sadeghi, Farnaz
    Viktor, Herna L.
    Vafaie, Parsa
    APPLIED INTELLIGENCE, 2023, 53 (21) : 24908 - 24930
  • [39] Multi-class Active Learning: A Hybrid Informative and Representative Criterion Inspired Approach
    Wang, Zengmao
    Du, Bo
    Zhang, Lefei
    2017 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2017, : 1510 - 1517
  • [40] Integrating Bayesian and Discriminative Sparse Kernel Machines for Multi-class Active Learning
    Shi, Weishi
    Yu, Qi
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 32 (NIPS 2019), 2019, 32