Improved deep learning based multi-index comprehensive evaluation system for the research on accuracy and real-time performance of coal gangue particles recognition

Cited by: 0
Authors
Zhang, Yao [1 ]
Yang, Yang [1 ,2 ,4 ]
Zeng, Qingliang [1 ,2 ,3 ]
Affiliations
[1] Shandong Univ Sci & Technol, Coll Mech & Elect Engn, Qingdao, Peoples R China
[2] Shandong Prov Key Lab Min Mech Engn, Qingdao, Peoples R China
[3] Shandong Normal Univ, Coll Informat Sci & Engn, Jinan, Peoples R China
[4] Shandong Univ Sci & Technol, Coll Mech & Elect Engn, Qingdao 266590, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
coal gangue recognition; MICES; random impact test; real-time; recognition accuracy; TI-CNN; IMPACT-SLIP EXPERIMENTS; IDENTIFICATION; TECHNOLOGY; NETWORK; ROCK;
DOI
10.1002/ese3.1564
CLC Classification Number
TE [Petroleum and Natural Gas Industry]; TK [Energy and Power Engineering];
Subject Classification Code
0807; 0820;
Abstract
The problems of insufficient recognition accuracy, poor real-time performance, and lack of consideration of actual working conditions keep this technology for the intelligent construction of coal mines at the research stage, not yet applied in practical engineering. The purpose of this paper is to establish an accurate, real-time recognition model that exploits the different physical properties of coal and gangue particles to quickly distinguish their vibration acceleration signals under the influence of external factors such as impact position, velocity, and direction. Accordingly, the accuracy and real-time performance of coal gangue recognition models built with different convolutional neural network (CNN) structures and different signal-input positions are studied. First, to meet the real-time requirement, an original CNN recognition model composed of a single convolution layer and a single pooling layer is established, and the data collected by seven sensors are input as a two-dimensional matrix; however, the stability of the training and test results is insufficient. To solve this problem, a once-improved CNN (OI-CNN) recognition model with multiple convolution and pooling layers is built by deepening the network. The experimental results show that stability and accuracy improve, but real-time performance is poor. Furthermore, through parameter adjustment the OI-CNN is changed into the twice-improved CNN (TI-CNN), and the sensor data at different positions are input as one-dimensional vectors. The results show that the accuracy and real-time performance of the TI-CNN coal gangue recognition model improve further. Finally, in line with the research purpose of this paper, weights are assigned to the CNN indexes and a multi-index comprehensive evaluation system (MICES) is established. With the original CNN recognition model as the control, the OI-CNN and TI-CNN recognition models at different positions are compared quantitatively to obtain a comprehensive evaluation score for each model. The results show that the MICES score of the coal gangue recognition model built on the TI-CNN structure with the data input of a single position sensor is the highest, while the sensor position has little effect on the recognition results.
[Graphical abstract: physical prototype of the test bed and the vibration acceleration signal acquisition system.]
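The abstract describes the network shapes only at a high level: a 2-D input of seven stacked sensor signals with one convolution and one pooling layer for the original CNN, and a deeper 1-D design over a single sensor's signal for the TI-CNN. The following is a minimal PyTorch sketch of those two shapes; the framework choice, class names, kernel sizes, channel counts, window length, and two-class output are all illustrative assumptions, as the paper's actual settings are not given in this record.

import torch
import torch.nn as nn

class OriginalCNN(nn.Module):
    """Single conv + single pooling layer; input is a 2-D matrix
    stacking seven sensors' signals (assumed 7 x window samples)."""
    def __init__(self, window: int = 256, n_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(16 * (7 // 2) * (window // 2), n_classes),
        )
    def forward(self, x):  # x: (batch, 1, 7, window)
        return self.classifier(self.features(x))

class TICNN(nn.Module):
    """TI-CNN-style variant: stacked 1-D convolution/pooling layers
    over a single sensor's signal fed as a one-dimensional vector."""
    def __init__(self, window: int = 256, n_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(16, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * (window // 4), n_classes),
        )
    def forward(self, x):  # x: (batch, 1, window)
        return self.classifier(self.features(x))

The structural trade-off the abstract reports follows from these shapes: the 1-D single-sensor input gives the TI-CNN far fewer activations per forward pass than the 7-row 2-D input, which is consistent with its better real-time score.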
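The MICES itself is only characterized as assigning weights to the CNN indexes and combining them into a single score per model. A plain-Python sketch of one such weighted scoring is shown below; the index set (accuracy and inference time), the linear normalization, and the 0.6/0.4 weights are assumptions for illustration, not the paper's actual scheme.

def mices_score(accuracy: float, infer_time_ms: float,
                worst_time_ms: float = 100.0,
                w_acc: float = 0.6, w_time: float = 0.4) -> float:
    """Weighted sum of a benefit index (accuracy, already in [0, 1])
    and a cost index (inference time, mapped so that faster is better)."""
    time_index = max(0.0, 1.0 - infer_time_ms / worst_time_ms)
    return w_acc * accuracy + w_time * time_index

# Example: score two hypothetical models against each other.
baseline = mices_score(accuracy=0.90, infer_time_ms=40.0)
ti_cnn = mices_score(accuracy=0.97, infer_time_ms=12.0)
print(f"baseline={baseline:.3f}, TI-CNN={ti_cnn:.3f}")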
Pages: 4077-4091
Number of pages: 15