Adaptive batch mode active learning with deep similarity

Cited by: 1
Authors
Zhang, Kaiyuan [1 ]
Qian, Buyue [2 ]
Wei, Jishang [3 ]
Yin, Changchang [1 ]
Cao, Shilei [1 ]
Li, Xiaoyu [1 ]
Cao, Yanjun [4 ]
Zheng, Qinghua [1 ]
Affiliations
[1] Xi An Jiao Tong Univ, Sch Elect & Informat Engn, Xian 710049, Shaanxi, Peoples R China
[2] Capital Med Univ, Beijing Chaoyang Hosp, Beijing 100020, Peoples R China
[3] HP Labs, 1501 Page Mill Rd, Palo Alto, CA 94304 USA
[4] Northwest Univ, Biomed Key Lab Shaanxi Prov, Xian 710069, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Active learning; Adaptive batch mode active learning; Classification model; Deep neural network; Deep learning; CLASSIFICATION;
DOI
10.1016/j.eij.2023.100412
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Active learning is typically used in scenarios where few labels are available and manual labeling is expensive. To improve model performance, the most valuable instances among all unlabeled instances must be identified and labeled, so as to maximize the benefit of each labeling effort. In practical scenarios, it is often more efficient to query a group of instances rather than an individual instance in each iteration. To achieve this, the similarities between instances must be explored to ensure both informativeness and diversity. Many ad-hoc algorithms have been proposed for batch mode active learning, and two major issues remain. First, similarity measurement among instances often relies only on the raw feature representation and is not well integrated with the classification model, which degrades the precision of the diversity measurement. Second, to explore the decision boundary, these algorithms typically choose instances near the boundary, yet the true boundary is difficult to estimate when few labeled instances are available. As large numbers of instances are labeled, the information between instances remains underused, and performance can be greatly improved if it is properly exploited. In our work, we propose an adaptive algorithm based on deep neural networks to solve the two problems mentioned above. During the training phase, we build a paired network to improve the accuracy of the classification model; this network also projects each instance into a new feature space for more accurate similarity measurement. When labeling instances in batches, the adaptive algorithm selects instances by balancing maximum uncertainty (exploration) against diversity (exploitation). Our algorithm has been validated on heart failure prediction tasks using real-world EHR datasets. Because the EHR data cannot be made public, we also conducted validation on two other classic classification tasks.
Our algorithm outperforms the baseline methods in both accuracy and convergence rate.
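The abstract's core idea of balancing uncertainty (exploration) against diversity (exploitation) when selecting a batch can be illustrated with a generic greedy procedure. This is only a sketch of the general batch-mode selection technique, not the paper's exact algorithm; the function name `select_batch` and the trade-off weight `alpha` are hypothetical, and `embeddings` stands in for the instance representations the paper obtains from its paired network.

```python
import numpy as np

def select_batch(embeddings, uncertainties, batch_size, alpha=0.5):
    """Greedily pick a batch that balances uncertainty (exploration)
    against diversity in the embedding space (exploitation).

    Illustrative sketch only: `alpha` weights uncertainty vs. diversity;
    an adaptive scheme could vary it across labeling iterations.
    """
    n = len(uncertainties)
    # Seed the batch with the single most uncertain instance.
    selected = [int(np.argmax(uncertainties))]
    while len(selected) < batch_size:
        best, best_score = None, -np.inf
        for i in range(n):
            if i in selected:
                continue
            # Diversity: distance to the closest already-selected instance.
            div = min(np.linalg.norm(embeddings[i] - embeddings[j])
                      for j in selected)
            score = alpha * uncertainties[i] + (1 - alpha) * div
            if score > best_score:
                best, best_score = i, score
        selected.append(best)
    return selected
```

In a full active learning loop, `uncertainties` would come from the current classifier (e.g. predictive entropy) and `embeddings` from the learned similarity space, with the selected batch sent to an annotator before retraining.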
Pages: 11
Related Papers
50 records
  • [1] Adaptive Batch Mode Active Learning
    Chakraborty, Shayok
    Balasubramanian, Vineeth
    Panchanathan, Sethuraman
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2015, 26 (08) : 1747 - 1760
  • [2] Deep Similarity-Based Batch Mode Active Learning with Exploration-Exploitation
    Yin, Changchang
    Qian, Buyue
    Cao, Shilei
    Li, Xiaoyu
    Wei, Jishang
    Zheng, Qinghua
    Davidson, Ian
    2017 17TH IEEE INTERNATIONAL CONFERENCE ON DATA MINING (ICDM), 2017, : 575 - 584
  • [3] Batch mode active learning via adaptive criteria weights
    Li, Hao
    Wang, Yongli
    Li, Yanchao
    Xiao, Gang
    Hu, Peng
    Zhao, Ruxin
    APPLIED INTELLIGENCE, 2021, 51 (06) : 3475 - 3489
  • [4] Correction to: Batch mode active learning via adaptive criteria weights
    Li, Hao
    Wang, Yongli
    Li, Yanchao
    Xiao, Gang
    Hu, Peng
    Zhao, Ruxin
    APPLIED INTELLIGENCE, 2021, 51 (06) : 3490 - 3490
  • [5] Exploring chemical and conformational spaces by batch mode deep active learning
    Zaverkin, Viktor
    Holzmueller, David
    Steinwart, Ingo
    Kaestner, Johannes
    DIGITAL DISCOVERY, 2022, 1 (05): : 605 - 620
  • [6] Batch Mode Active Learning with Nonlocal Self-Similarity Prior for Semantic Segmentation
    Tan, Yao
    Hu, Qinghua
    Du, Zhibin
    2019 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2019,
  • [7] Dynamic Batch Mode Active Learning
    Chakraborty, Shayok
    Balasubramanian, Vineeth
    Panchanathan, Sethuraman
    2011 IEEE CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2011,
  • [8] Batch Mode Active Learning for Networked Data
    Shi, Lixin
    Zhao, Yuhang
    Tang, Jie
    ACM TRANSACTIONS ON INTELLIGENT SYSTEMS AND TECHNOLOGY, 2012, 3 (02)
  • [9] Batch Mode Active Learning for Biometric Recognition
    Chakraborty, Shayok
    Balasubramanian, Vineeth
    Panchanathan, Sethuraman
    BIOMETRIC TECHNOLOGY FOR HUMAN IDENTIFICATION VII, 2010, 7667