Employing Convolutional Neural Networks for Continual Learning

Cited by: 1
Authors
Jasinski, Marcin [1 ]
Wozniak, Michal [1 ]
Affiliations
[1] Wroclaw Univ Sci & Technol, Dept Syst & Comp Networks, Wybrzeze Wyspianskiego 27, PL-50370 Wroclaw, Poland
Keywords
Continual learning; Concept drift; Bayesian convolutional neural network; Deep learning
DOI
10.1007/978-3-031-23492-7_25
CLC Number
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
The main motivation for the presented research was to investigate the behavior of different convolutional neural network architectures in the analysis of non-stationary data streams. Learning a model on continuously incoming data differs from learning on a complete, immediately available training set. However, streaming data is much closer to real-world conditions, since most data nowadays must be analyzed as soon as it arrives (e.g., in anti-fraud systems, cybersecurity, and the analysis of images from on-board cameras and other sensors). Besides the limitations of computational and memory resources that such algorithms must respect, one of the critical difficulties is the possibility of concept drift. This phenomenon means that the probabilistic characteristics of the considered task change, which may in consequence lead to a significant decrease in classification accuracy. This paper pays special attention to convolutional neural network models based on probabilistic methods: Monte Carlo dropout and Bayesian convolutional neural networks. Of particular interest is the uncertainty of the predictions returned by the model, which arises mainly during the classification of drifting data streams. Under such conditions, the prediction system should be able to report high prediction uncertainty and signal the need to update the model. This paper aims to study the behavior of the models mentioned above in the classification of non-stationary data streams and to determine the impact of a sudden drift on the accuracy and uncertainty of their predictions.
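The Monte Carlo dropout idea mentioned in the abstract can be sketched briefly: dropout is kept active at prediction time, the input is passed through the network many times, and the spread of the resulting class probabilities serves as an uncertainty estimate. The toy single-layer "network" below is a hypothetical illustration (not the paper's architecture), shown only to make the mechanism concrete:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy "network": one linear layer + softmax, 4 features -> 3 classes.
W = rng.normal(size=(4, 3))

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def forward(x, drop_p=0.5):
    # Dropout stays ACTIVE at prediction time -- the key idea of MC dropout.
    mask = rng.random(x.shape) > drop_p
    return softmax((x * mask / (1.0 - drop_p)) @ W)

def mc_dropout_predict(x, n_samples=100):
    # Average many stochastic forward passes; the spread across passes
    # estimates predictive uncertainty.
    probs = np.stack([forward(x) for _ in range(n_samples)])
    return probs.mean(axis=0), probs.std(axis=0)

x = np.array([1.0, -0.5, 2.0, 0.3])
mean_p, std_p = mc_dropout_predict(x)
```

A high standard deviation on the winning class signals an uncertain prediction, which is exactly the kind of information a streaming classifier could use to flag a possible concept drift and trigger a model update.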
Pages: 288-297 (10 pages)
Related Papers
50 in total
  • [1] Continual Learning on Facial Recognition Using Convolutional Neural Networks
    Feng, Jingjing
    Gomez, Valentina
    UPB Scientific Bulletin, Series C: Electrical Engineering and Computer Science, 2023, 85(03): 239-248
  • [2] Continual Learning with Neural Networks: A Review
    Awasthi, Abhijeet
    Sarawagi, Sunita
    Proceedings of the 6th ACM IKDD CoDS and 24th COMAD, 2019: 362-365
  • [3] Convolutional Neural Network With Developmental Memory for Continual Learning
    Park, Gyeong-Moon
    Yoo, Sahng-Min
    Kim, Jong-Hwan
    IEEE Transactions on Neural Networks and Learning Systems, 2021, 32(06): 2691-2705
  • [4] Continual Learning Using Bayesian Neural Networks
    Li, Honglin
    Barnaghi, Payam
    Enshaeifare, Shirin
    Ganz, Frieder
    IEEE Transactions on Neural Networks and Learning Systems, 2021, 32(09): 4243-4252
  • [5] Continual Robot Learning with Constructive Neural Networks
    Grossmann, A
    Poli, R
    Learning Robots, Proceedings, 1998, 1545: 95-108
  • [6] Sparse Progressive Neural Networks for Continual Learning
    Ergun, Esra
    Toreyin, Behcet Ugur
    Advances in Computational Collective Intelligence (ICCCI 2021), 2021, 1463: 715-725
  • [7] Continual Learning with Sparse Progressive Neural Networks
    Ergun, Esra
    Toreyin, Behcet Ugur
    2020 28th Signal Processing and Communications Applications Conference (SIU), 2020
  • [8] Continual Lifelong Learning with Neural Networks: A Review
    Parisi, German I.
    Kemker, Ronald
    Part, Jose L.
    Kanan, Christopher
    Wermter, Stefan
    Neural Networks, 2019, 113: 54-71
  • [9] Efficient Continual Learning in Neural Networks with Embedding Regularization
    Pomponi, Jary
    Scardapane, Simone
    Lomonaco, Vincenzo
    Uncini, Aurelio
    Neurocomputing, 2020, 397: 139-148