Using Neural Networks to Detect Fire from Overhead Images

Cited by: 1
Authors
Kurasinski, Lukas [1 ]
Tan, Jason [1 ]
Malekian, Reza [1 ]
Affiliations
[1] Malmo Univ, Dept Comp Sci & Media Technol, S-20506 Malmo, Sweden
Keywords
Neural networks; Fire detection; Datasets; Accuracy
DOI
10.1007/s11277-023-10321-7
CLC number: TN [Electronic technology, communication technology]
Discipline code: 0809
Abstract
The use of artificial intelligence (AI) is increasing in everyday applications, and image recognition is one emerging field within AI. Research devoted to predicting fires has focused on predicting fire behaviour, that is, how a fire will spread based on key environmental factors such as moisture, weather conditions, and human presence. Correctly predicting fire spread can help firefighters minimise damage, decide on possible actions, and allocate personnel effectively in fire-prone areas so that fires can be extinguished quickly. Neural networks (NN) have proven exceptional for active fire detection, classifying smoke and separating it from visually similar patterns such as clouds, ground, dust, and ocean. Recent advances in NN-based fire detection have shown that aerial imagery from drones as well as satellites yields strong results in detecting and classifying fires. These systems are computationally heavy and require a tremendous amount of data, and an NN model is inextricably linked to the dataset on which it is trained. The cornerstone of this study is the data dependency of these models. The model herein is trained on two separate datasets and tested on three datasets in total in order to investigate this dependency. When validated on its own datasets, the model reached accuracies of 92% and 99%, respectively, compared with an accuracy of 94% reported in previous work. When evaluated on datasets it was not trained on, the model performed in the 60% range in five out of six cases, with an outlier of 29% in one case.
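The cross-dataset protocol described in the abstract (train on one dataset, validate in-domain, then test on datasets the model never saw) can be sketched roughly as below. This is a minimal illustrative example, not the authors' implementation: the dataset paths, the fire/no_fire folder layout, the small convolutional network, and the training settings are all assumptions.

    # Sketch of cross-dataset evaluation for a binary fire / no-fire image classifier.
    # Paths and architecture are illustrative assumptions, not the paper's setup.
    import torch
    import torch.nn as nn
    from torch.utils.data import DataLoader
    from torchvision import datasets, transforms

    tfm = transforms.Compose([
        transforms.Resize((128, 128)),
        transforms.ToTensor(),
    ])

    # Hypothetical folder layout: <root>/fire/*.jpg and <root>/no_fire/*.jpg
    train_set = datasets.ImageFolder("data/aerial_fire_A/train", transform=tfm)
    test_sets = {
        "A (in-domain)": datasets.ImageFolder("data/aerial_fire_A/test", transform=tfm),
        "B (cross-dataset)": datasets.ImageFolder("data/aerial_fire_B/test", transform=tfm),
    }

    # Small CNN producing two logits (fire vs. no fire)
    model = nn.Sequential(
        nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        nn.Flatten(),
        nn.Linear(32 * 32 * 32, 2),
    )
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()

    # Brief training loop on dataset A only
    for epoch in range(5):
        model.train()
        for x, y in DataLoader(train_set, batch_size=32, shuffle=True):
            opt.zero_grad()
            loss_fn(model(x), y).backward()
            opt.step()

    # Evaluate on the in-domain test split and on the unseen dataset B
    model.eval()
    with torch.no_grad():
        for name, ds in test_sets.items():
            correct = total = 0
            for x, y in DataLoader(ds, batch_size=32):
                correct += (model(x).argmax(dim=1) == y).sum().item()
                total += y.numel()
            print(f"{name}: accuracy = {correct / total:.2%}")

In such a setup the in-domain accuracy is expected to sit near the validation figures, while the cross-dataset accuracies expose how strongly the learned features depend on the training data, which is the dependency the study investigates.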
Pages: 1085-1105
Page count: 21