Turbulent flame image classification using Convolutional Neural Networks

Cited by: 8
Authors
Roncancio, Rathziel [1 ]
El Gamal, Aly [2 ]
Gore, Jay P. [1 ]
Affiliations
[1] Purdue Univ, Sch Mech Engn, W Lafayette, IN 47907 USA
[2] Purdue Univ, Sch Elect & Comp Engn, W Lafayette, IN 47907 USA
Keywords
CNN; Flame; Neural network; Turbulent; PREMIXED FLAMES; LOCAL FLAME; OH-PLIF; COMBUSTION; MODEL; CH;
DOI
10.1016/j.egyai.2022.100193
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Pockets of unburned material in turbulent premixed flames burning CH4, air, and CO2 were studied using OH Planar Laser-Induced Fluorescence (PLIF) images to improve current understanding. Such flames are common in natural gas-air combustors running gas turbines with dry exhaust gas recirculation (EGR) for land-based power generation. Essential improvements continue in the characterization and understanding of turbulent flames with EGR, particularly for transient events such as ignition and extinction. Pockets and/or islands of unburned material within burned and unburned turbulent media are characteristic features of these events. These features reduce heat release rates and increase carbon monoxide and hydrocarbon emissions. The present work involves Convolutional Neural Network (CNN) based classification of PLIF images containing unburned pockets in three turbulent flames with 0%, 5%, and 10% CO2. The CNN model was constructed with three convolutional layers and two fully connected layers, using dropout and weight decay for regularization. Accuracies of 94.2%, 92.3%, and 89.2% were obtained for the three flames, respectively. The present approach offers significant computational time savings relative to conventional image processing methods.
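The architecture described in the abstract (three convolutional layers, two fully connected layers, dropout, and weight decay) can be sketched as below. This is a minimal illustrative PyTorch sketch, not the authors' implementation: the filter counts, kernel sizes, the 64x64 single-channel input resolution, and the binary pocket/no-pocket output are all assumptions made for the example.

```python
import torch
import torch.nn as nn

class PocketCNN(nn.Module):
    """Hypothetical CNN matching the abstract's description:
    three convolutional layers + two fully connected layers,
    with dropout; weight decay is applied via the optimizer."""

    def __init__(self, num_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            # Three conv blocks; each halves the spatial resolution.
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            # 64 channels at 8x8 after three 2x poolings of a 64x64 input.
            nn.Linear(64 * 8 * 8, 128), nn.ReLU(),
            nn.Dropout(0.5),  # dropout regularization, as in the abstract
            nn.Linear(128, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

model = PocketCNN()
# Weight decay (L2 regularization) is set on the optimizer; the 1e-4
# value and the Adam choice are illustrative, not from the paper.
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3, weight_decay=1e-4)

# Forward pass on a dummy batch of four 1-channel 64x64 PLIF-like frames.
logits = model(torch.randn(4, 1, 64, 64))
print(logits.shape)  # torch.Size([4, 2])
```

In practice such a model would be trained with a cross-entropy loss on labeled PLIF frames (pocket present / absent); the dropout layer is active only in `model.train()` mode.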
Pages: 8
Related papers
(50 in total)
  • [21] Breast Cancer Histopathological Image Classification using Convolutional Neural Networks
    Spanhol, Fabio Alexandre
    Oliveira, Luiz S.
    Petitjean, Caroline
    Heutte, Laurent
    2016 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2016, : 2560 - 2567
  • [22] Breast Ultrasound Image Classification and Segmentation Using Convolutional Neural Networks
    Xie, Xiaozheng
    Shi, Faqiang
    Niu, Jianwei
    Tang, Xiaolan
    ADVANCES IN MULTIMEDIA INFORMATION PROCESSING, PT III, 2018, 11166 : 200 - 211
  • [23] Image Classification Using Convolutional Neural Networks with Different Convolution Operations
    Hsu, Chi-Yi
    Tseng, Chien-Cheng
    Lee, Su-Ling
    Xiao, Bing-Yu
    2020 IEEE INTERNATIONAL CONFERENCE ON CONSUMER ELECTRONICS - TAIWAN (ICCE-TAIWAN), 2020,
  • [24] CLASSIFICATION OF MICROCHANNEL FLAME REGIMES BASED ON CONVOLUTIONAL NEURAL NETWORKS
    Isfahani, Seyed Navid Roohani
    Sauer, Vinicius M.
    Schoegl, Ingmar
    PROCEEDINGS OF THE ASME 2021 POWER CONFERENCE (POWER2021), 2021,
  • [25] Improved Convolutional Neural Networks for Hyperspectral Image Classification
    Kalita, Shashanka
    Biswas, Mantosh
    RECENT DEVELOPMENTS IN MACHINE LEARNING AND DATA ANALYTICS, 2019, 740 : 397 - 410
  • [26] Bag of Tricks for Image Classification with Convolutional Neural Networks
    He, Tong
    Zhang, Zhi
    Zhang, Hang
    Zhang, Zhongyue
    Xie, Junyuan
    Li, Mu
    2019 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR 2019), 2019, : 558 - 567
  • [27] Mirror invariant convolutional neural networks for image classification
    Lu, Shufang
    Li, Yan
    Wang, Minqian
    Gao, Fei
    IET IMAGE PROCESSING, 2022, 16 (06) : 1626 - 1635
  • [28] Analysis of Convolutional Neural Networks for Document Image Classification
    Tensmeyer, Chris
    Martinez, Tony
    2017 14TH IAPR INTERNATIONAL CONFERENCE ON DOCUMENT ANALYSIS AND RECOGNITION (ICDAR), VOL 1, 2017, : 388 - 393
  • [29] Deformable Convolutional Neural Networks for Hyperspectral Image Classification
    Zhu, Jian
    Fang, Leyuan
    Ghamisi, Pedram
    IEEE GEOSCIENCE AND REMOTE SENSING LETTERS, 2018, 15 (08) : 1254 - 1258
  • [30] Hierarchical convolutional neural networks for fashion image classification
    Seo, Yian
    Shin, Kyung-shik
    EXPERT SYSTEMS WITH APPLICATIONS, 2019, 116 : 328 - 339