Scalable Batch Acquisition for Deep Bayesian Active Learning

Times Cited: 0
Authors
Rubashevskii, Aleksandr [1 ]
Kotova, Dania [1 ]
Panov, Maxim [2 ]
Affiliations
[1] Skolkovo Institute of Science and Technology, Moscow, Russia
[2] Technology Innovation Institute, Abu Dhabi, United Arab Emirates
Source
PROCEEDINGS OF THE 2023 SIAM INTERNATIONAL CONFERENCE ON DATA MINING, SDM, 2023
Funding
Russian Science Foundation
Keywords
DOI
Not available
CLC Classification Number
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In deep active learning, it is important to select multiple examples for labeling at each step in order to work efficiently, especially on large datasets. At the same time, existing solutions to this problem in the Bayesian setting, such as BatchBALD, have significant limitations when selecting a large number of examples, owing to the exponential complexity of computing mutual information for joint random variables. We therefore present the Large BatchBALD algorithm, a well-grounded approximation to BatchBALD that aims to achieve comparable quality while being more computationally efficient. We provide a complexity analysis of the algorithm, showing a reduction in computation time, especially for large batches. Furthermore, we present an extensive set of experimental results on image and text data, both on toy datasets and on larger ones such as CIFAR-100.
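For context, the sketch below shows the standard per-point BALD acquisition score estimated from Monte Carlo dropout samples, i.e. the mutual-information quantity that BatchBALD (and the Large BatchBALD approximation described in the abstract) generalizes to jointly scored batches. This is an illustrative assumption-laden sketch, not the paper's algorithm: the function names (bald_scores, top_b_bald), array shapes, and the naive top-b selection are chosen here for exposition only; the naive top-b rule is the simple baseline whose batch diversity BatchBALD-style methods improve on.

```python
import numpy as np

def bald_scores(probs, eps=1e-12):
    """Per-point BALD score I[y; theta | x] from MC-dropout samples.

    probs: array of shape (K, N, C) holding softmax outputs from K
           stochastic forward passes over N pool points with C classes.
    Returns an (N,) array: predictive entropy minus expected conditional entropy.
    """
    mean_p = probs.mean(axis=0)                                   # (N, C) predictive distribution
    h_pred = -(mean_p * np.log(mean_p + eps)).sum(axis=1)         # H[y | x, D]
    h_cond = -(probs * np.log(probs + eps)).sum(axis=2).mean(0)   # E_theta H[y | x, theta]
    return h_pred - h_cond

def top_b_bald(probs, b):
    """Naive batch acquisition: take the b highest individual BALD scores.

    BatchBALD instead scores candidate batches jointly, which is what becomes
    expensive for large b and motivates cheaper approximations.
    """
    return np.argsort(-bald_scores(probs))[:b]

# Toy usage with random "MC-dropout" probabilities:
# 20 passes, 1000 pool points, 10 classes.
rng = np.random.default_rng(0)
probs = rng.dirichlet(np.ones(10), size=(20, 1000))
batch = top_b_bald(probs, b=32)
```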
Pages: 739-747
Number of pages: 9
Related Papers
50 records in total
  • [1] BatchBALD: Efficient and Diverse Batch Acquisition for Deep Bayesian Active Learning
    Kirsch, Andreas
    van Amersfoort, Joost
    Gal, Yarin
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 32 (NIPS 2019), 2019, 32
  • [2] Deep Active Learning with Adaptive Acquisition
    Haussmann, Manuel
    Hamprecht, Fred
    Kandemir, Melih
    PROCEEDINGS OF THE TWENTY-EIGHTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2019, : 2470 - 2476
  • [3] Bayesian Generative Active Deep Learning
    Toan Tran
    Thanh-Toan Do
    Reid, Ian
    Carneiro, Gustavo
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 97, 2019, 97
  • [4] Beam Search Optimized Batch Bayesian Active Learning
    Sun, Jingyu
    Zhai, Hongjie
    Saisho, Osamu
    Takeuchi, Susumu
    THIRTY-SEVENTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 37 NO 5, 2023, : 6084 - 6091
  • [5] Bayesian Batch Active Learning as Sparse Subset Approximation
    Pinsler, Robert
    Gordon, Jonathan
    Nalisnick, Eric
    Hernandez-Lobato, Jose Miguel
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 32 (NIPS 2019), 2019, 32
  • [6] Deep Bayesian Active Learning with Image Data
    Gal, Yarin
    Islam, Riashat
    Ghahramani, Zoubin
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 70, 2017, 70
  • [7] A Bayesian active learning platform for scalable combination drug screens
    Tosh, Christopher
    Tec, Mauricio
    White, Jessica B.
    Quinn, Jeffrey F.
    Sanchez, Glorymar Ibanez
    Calder, Paul
    Kung, Andrew L.
    Dela Cruz, Filemon S.
    Tansey, Wesley
    NATURE COMMUNICATIONS, 2025, 16 (01)
  • [8] NimbleLearn: A Scalable and Fast Batch-mode Active Learning Approach
    Kong, Ruoyan
    Qiu, Zhanlong
    Liu, Yang
    Zhao, Qi
    21ST IEEE INTERNATIONAL CONFERENCE ON DATA MINING WORKSHOPS ICDMW 2021, 2021, : 350 - 359
  • [9] Adaptive batch mode active learning with deep similarity
    Zhang, Kaiyuan
    Qian, Buyue
    Wei, Jishang
    Yin, Changchang
    Cao, Shilei
    Li, Xiaoyu
    Cao, Yanjun
    Zheng, Qinghua
    EGYPTIAN INFORMATICS JOURNAL, 2023, 24 (04)
  • [10] A Framework and Benchmark for Deep Batch Active Learning for Regression
    Holzmueller, David
    Zaverkin, Viktor
    Kaestner, Johannes
    Steinwart, Ingo
    JOURNAL OF MACHINE LEARNING RESEARCH, 2023, 24