ClickBAIT: Click-based Accelerated Incremental Training of Convolutional Neural Networks

Citations: 0
Authors
Teng, Ervin [1 ]
Falcao, Joao Diogo [1 ]
Huang, Rui [1 ]
Iannucci, Bob [1 ]
Affiliations
[1] Carnegie Mellon Univ, Dept Elect & Comp Engn, NASA Ames Res Pk,Bldg 23,MS 23-11, Moffett Field, CA 94035 USA
Keywords
DOI
Not available
Chinese Library Classification (CLC) number
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Training deep learning models in real-time, with a human in-the-loop, could allow them to be adjusted and adapted on-the-fly, as demanded by the mission at hand. For instance, during a tracking and surveillance operation, an unmanned aerial system (UAS) operator spots the subject, and quickly trains and distributes an object-specific recognition model to other drones and cameras while the situation is unfolding. We call this type of real-time training Time-ordered Online Training (ToOT). In this work, we present ClickBAIT, a framework for performing ToOT in real-time. ClickBAIT reduces the human effort required to perform training in real-time by reducing annotations to single clicks, and employing object tracking to produce additional training events. We implement ClickBAIT for both an image classifier and object detector, and show that it improves the training benefit of clicks from 3 to 7 times on representative video sequences.
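The core idea the abstract describes can be illustrated with a minimal sketch: one user click seeds a bounding box, and a tracker propagates that box through subsequent frames, each propagated box becoming an additional training event. The sketch below is an assumption-laden illustration, not the paper's implementation; `stub_tracker`, the fixed box size, and the known per-frame motion are all hypothetical stand-ins for the real visual tracker ClickBAIT would use.

```python
from dataclasses import dataclass
from typing import List, Tuple

Box = Tuple[int, int, int, int]  # (x, y, w, h)

@dataclass
class TrainingEvent:
    frame_index: int
    box: Box

def click_to_box(click: Tuple[int, int], size: int = 64) -> Box:
    """Turn a single click into a fixed-size box centered on it
    (box size is an illustrative assumption)."""
    x, y = click
    return (x - size // 2, y - size // 2, size, size)

def stub_tracker(box: Box, motion: Tuple[int, int]) -> Box:
    """Stand-in for a real visual tracker: here the per-frame motion
    is given explicitly instead of estimated from pixels."""
    x, y, w, h = box
    dx, dy = motion
    return (x + dx, y + dy, w, h)

def generate_training_events(click: Tuple[int, int],
                             motions: List[Tuple[int, int]]) -> List[TrainingEvent]:
    """One click yields one labeled frame; tracking the box through
    each later frame yields one extra training event per frame."""
    box = click_to_box(click)
    events = [TrainingEvent(0, box)]
    for i, motion in enumerate(motions, start=1):
        box = stub_tracker(box, motion)
        events.append(TrainingEvent(i, box))
    return events

# One click plus two tracked frames -> three training events.
events = generate_training_events((100, 100), [(2, 0), (2, 1)])
```

This is how a single click can "pay for" several labeled samples: the multiplier reported in the abstract (3 to 7 times) corresponds to how many frames the tracker can reliably follow the object before a new click is needed.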
Pages: 12
Related papers
50 records in total
  • [1] Efficient Incremental Training for Deep Convolutional Neural Networks
    Tao, Yudong
    Tu, Yuexuan
    Shyu, Mei-Ling
    [J]. 2019 2ND IEEE CONFERENCE ON MULTIMEDIA INFORMATION PROCESSING AND RETRIEVAL (MIPR 2019), 2019, : 286 - 291
  • [2] Click-based covalent adaptable networks
    Kloxin, Christopher
    [J]. ABSTRACTS OF PAPERS OF THE AMERICAN CHEMICAL SOCIETY, 2018, 256
  • [3] Increasing Convolutional Neural Networks Training Speed by Incremental Complexity Learning
    Wanderley, Miguel D. S.
    Prudencio, Ricardo B. C.
    [J]. 2018 7TH BRAZILIAN CONFERENCE ON INTELLIGENT SYSTEMS (BRACIS), 2018, : 103 - 108
  • [4] INCREMENTAL LEARNING OF CONVOLUTIONAL NEURAL NETWORKS
    Medera, Dusan
    Babinec, Stefan
    [J]. IJCCI 2009: PROCEEDINGS OF THE INTERNATIONAL JOINT CONFERENCE ON COMPUTATIONAL INTELLIGENCE, 2009, : 547 - +
  • [5] Guaranteed Convergence of Training Convolutional Neural Networks via Accelerated Gradient Descent
    Zhang, Shuai
    Wang, Meng
    Liu, Sijia
    Chen, Pin-Yu
    Xiong, Jinjun
    [J]. 2020 54TH ANNUAL CONFERENCE ON INFORMATION SCIENCES AND SYSTEMS (CISS), 2020, : 41 - 46
  • [6] An FPGA-Based Processor for Training Convolutional Neural Networks
    Liu, Zhiqiang
    Dou, Yong
    Jiang, Jingfei
    Wang, Qiang
    Chow, Paul
    [J]. 2017 INTERNATIONAL CONFERENCE ON FIELD PROGRAMMABLE TECHNOLOGY (ICFPT), 2017, : 207 - 210
  • [7] Latent Training for Convolutional Neural Networks
    Huang, Zi
    Liu, Qi
    Chen, Zhiyuan
    Zhao, Yuming
    [J]. PROCEEDINGS OF 2015 INTERNATIONAL CONFERENCE ON ESTIMATION, DETECTION AND INFORMATION FUSION ICEDIF 2015, 2015, : 55 - 60
  • [8] Comparing Incremental Learning Strategies for Convolutional Neural Networks
    Lomonaco, Vincenzo
    Maltoni, Davide
    [J]. ARTIFICIAL NEURAL NETWORKS IN PATTERN RECOGNITION, 2016, 9896 : 175 - 184
  • [9] clCaffe: OpenCL accelerated Caffe for Convolutional Neural Networks
    Bottleson, Jeremy
    Kim, SungYe
    Andrews, Jeff
    Bindu, Preeti
    Murthy, Deepak N.
    Jin, Jingyi
    [J]. 2016 IEEE 30TH INTERNATIONAL PARALLEL AND DISTRIBUTED PROCESSING SYMPOSIUM WORKSHOPS (IPDPSW), 2016, : 50 - 57
  • [10] Learning in Convolutional Neural Networks Accelerated by Transfer Entropy
    Moldovan, Adrian
    Cataron, Angel
    Andonie, Razvan
    [J]. ENTROPY, 2021, 23 (09)