Deep Active Learning by Leveraging Training Dynamics

Cited: 0
Authors
Wang, Haonan [1 ]
Huang, Wei [2 ]
Wu, Ziwei [1 ]
Margenot, Andrew [1 ]
Tong, Hanghang [1 ]
He, Jingrui [1 ]
Affiliations
[1] Univ Illinois, Champaign, IL 61820 USA
[2] Univ New South Wales, Sydney, NSW, Australia
Funding
US National Science Foundation; US National Institute of Food and Agriculture;
Keywords
DOI
None available
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Subject Classification Numbers
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Active learning theories and methods have been extensively studied in classical statistical learning settings. However, deep active learning, i.e., active learning with deep learning models, is usually based on empirical criteria without solid theoretical justification, and is thus cast into doubt when some of those criteria fail to provide benefits in real applications. In this paper, by exploring the connection between generalization performance and training dynamics, we propose a theory-driven deep active learning method (dynamicAL) which selects samples so as to maximize training dynamics. In particular, we prove that the convergence speed of training and the generalization performance are positively correlated under the ultra-wide condition, and show that maximizing the training dynamics leads to better generalization performance. Furthermore, to scale up to large deep neural networks and data sets, we introduce two relaxations of the subset selection problem and reduce the time complexity from polynomial to constant. Empirical results show that dynamicAL not only outperforms the other baselines consistently but also scales well to large deep learning models. We hope our work will inspire more attempts at bridging the theoretical findings on deep networks and the practical impact of deep active learning in real applications.
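The selection principle described in the abstract, picking unlabeled samples that would maximally drive the training dynamics, can be illustrated with a toy sketch. Note this is not the paper's actual dynamicAL algorithm (which relies on neural-tangent-kernel theory and two relaxations of a subset selection problem); here a per-sample gradient norm under squared loss for a linear model is assumed as a crude proxy, and all names (`select_by_training_dynamics`, `y_pseudo`, `budget`) are hypothetical.

```python
import numpy as np

def select_by_training_dynamics(w, X_pool, y_pseudo, budget):
    """Rank pool points by the norm of their per-sample gradient under
    squared loss for a linear model with weights w, a simple proxy for
    how strongly each point would drive training if it were labeled.
    Returns the indices of the `budget` highest-ranked points."""
    residuals = X_pool @ w - y_pseudo               # per-sample prediction error
    grad_norms = np.abs(residuals) * np.linalg.norm(X_pool, axis=1)
    return np.argsort(grad_norms)[-budget:]         # top-`budget` indices

# Tiny synthetic pool: 100 points in 5 dimensions, pseudo-labels in {-1, +1}.
rng = np.random.default_rng(0)
X_pool = rng.normal(size=(100, 5))
w = rng.normal(size=5)
y_pseudo = np.sign(X_pool @ w + rng.normal(scale=2.0, size=100))
picked = select_by_training_dynamics(w, X_pool, y_pseudo, budget=10)
print(len(picked))  # 10
```

Real deep-learning variants would replace the closed-form residual with per-sample gradients of the network loss; the top-k selection over a scalar acquisition score is what makes the relaxed problem cheap relative to exact subset search.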
Pages: 14
Related Papers
50 records in total
  • [1] TiDAL: Learning Training Dynamics for Active Learning
    Kye, Seong Min
    Choi, Kwanghee
    Byun, Hyeongmin
    Chang, Buru
    [J]. 2023 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2023), 2023, : 22278 - 22288
  • [2] Leveraging Crowdsourcing Data for Deep Active Learning - An Application: Learning Intents in Alexa
    Yang, Jie
    Drake, Thomas
    Damianou, Andreas
    Maarek, Yoelle
    [J]. WEB CONFERENCE 2018: PROCEEDINGS OF THE WORLD WIDE WEB CONFERENCE (WWW2018), 2018, : 23 - 32
  • [3] Active Training Trajectory Generation for Inverse Dynamics Model Learning with Deep Neural Networks
    Zhou, Siqi
    Schoellig, Angela P.
    [J]. 2019 IEEE 58TH CONFERENCE ON DECISION AND CONTROL (CDC), 2019, : 1784 - 1790
  • [4] Pruning by leveraging training dynamics
    Apostol, Andrei C.
    Stol, Maarten C.
    Forre, Patrick
    [J]. AI COMMUNICATIONS, 2022, 35 (02) : 65 - 85
  • [5] Leveraging Weather Dynamics in Insurance Claims Triage Using Deep Learning
    Shi, Peng
    Zhang, Wei
    Shi, Kun
    [J]. JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION, 2024, 119 (546) : 825 - 838
  • [6] Leveraging Active and Continual Learning for Improving Deep Face Recognition in-the-Wild
    Tosidis, Pavlos
    Passalis, Nikolaos
    Tefas, Anastasios
    [J]. 2023 IEEE 25TH INTERNATIONAL WORKSHOP ON MULTIMEDIA SIGNAL PROCESSING, MMSP, 2023,
  • [7] When Deep Learners Change Their Mind: Learning Dynamics for Active Learning
    Zolfaghari Bengar, Javad
    Raducanu, Bogdan
    van de Weijer, Joost
    [J]. COMPUTER ANALYSIS OF IMAGES AND PATTERNS, CAIP 2021, PT 1, 2021, 13052 : 403 - 413
  • [8] Deep Active Learning: Unified and Principled Method for Query and Training
    Shui, Changjian
    Zhou, Fan
    Gagne, Christian
    Wang, Boyu
    [J]. INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 108, 2020, 108
  • [9] LEVERAGING DEEP REINFORCEMENT LEARNING FOR ACTIVE SHOOTING UNDER OPEN-WORLD SETTING
    Tzimas, A.
    Passalis, N.
    Tefas, A.
    [J]. 2020 IEEE INTERNATIONAL CONFERENCE ON MULTIMEDIA AND EXPO (ICME), 2020,
  • [10] Discovering misannotated lncRNAs using deep learning training dynamics
    Nabi, Afshan
    Dilekoglu, Berke
    Adebali, Ogun
    Tastan, Oznur
    [J]. BIOINFORMATICS, 2023, 39 (01)