Fan Blade Crack Detection Algorithm Based on Multi-Scale Feature Fusion

Cited by: 0
Authors
Qi, Yongjun [1 ,2 ]
Tang, Hailin [1 ,2 ]
Khuder, Altangerel [2 ]
Affiliations
[1] Guangdong Baiyun Univ, Fac Megadata & Comp, Guangzhou 510450, Peoples R China
[2] Mongolian Univ Sci & Technol, Sch Informat & Commun Technol, Ulaanbaatar 13341, Bayanzurkh, Mongolia
Source
IEEE ACCESS | 2025, Vol. 13
Keywords
Feature extraction; Blades; Fans; Convolutional neural networks; Accuracy; Long short term memory; Image recognition; Data mining; Wind turbines; Data models; SCADA database; neural network; LSTM feature extraction; wind turbine blades; DEEP NEURAL-NETWORKS; FEATURE-EXTRACTION;
DOI
10.1109/ACCESS.2025.3532077
Chinese Library Classification (CLC)
TP [automation technology, computer technology]
Discipline Classification Code
0812
Abstract
With the rapid development of the social economy, energy consumption has grown tremendously, and green energy sources such as wind power are now widely used, driving the construction of wind turbines. Because the electro-mechanical units operate over long periods, traditional maintenance is too costly. To detect and maintain wind turbine (fan) blades quickly and accurately, we draw on intelligent big data collected from the operating environment and propose a convolutional neural network model that addresses the low recognition rate caused by insufficient feature extraction from fan blade crack images, together with a long short-term memory (LSTM) convolutional neural network model. Dimensionality reduction of the captured image data improves the image recognition rate and reduces the image loss rate, while the detection model handles complex background problems such as target occlusion and overlap. Using an LSTM to extract a global context module effectively improves target detection accuracy; adding this module raises detection accuracy by about 3% to 7%. The optimized convolutional neural network accurately captures the image position and greatly improves the recognition rate, which can serve as a reference for future research in other fields.
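The abstract gives no implementation details; the following is a minimal, hypothetical sketch (in PyTorch, not the authors' released code) of the general idea it describes: a convolutional backbone for local crack features combined with an LSTM-based global-context module for classification. All layer sizes, the position-wise sequence reading, and the fusion by average pooling are illustrative assumptions.

```python
# Minimal sketch, assuming a CNN backbone plus an LSTM global-context module.
# This is an illustration of the idea in the abstract, not the paper's model.
import torch
import torch.nn as nn


class CrackDetector(nn.Module):
    def __init__(self, num_classes: int = 2, lstm_hidden: int = 128):
        super().__init__()
        # Convolutional backbone: extracts local crack features and
        # reduces the spatial dimensionality of the input image.
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, stride=2, padding=1),
            nn.BatchNorm2d(32),
            nn.ReLU(inplace=True),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1),
            nn.BatchNorm2d(64),
            nn.ReLU(inplace=True),
            nn.Conv2d(64, 128, kernel_size=3, stride=2, padding=1),
            nn.BatchNorm2d(128),
            nn.ReLU(inplace=True),
        )
        # Global-context module: spatial positions are read as a sequence so a
        # bidirectional LSTM can aggregate long-range context, which is meant
        # to help with occluded or overlapping targets.
        self.lstm = nn.LSTM(
            input_size=128, hidden_size=lstm_hidden,
            batch_first=True, bidirectional=True,
        )
        self.classifier = nn.Linear(2 * lstm_hidden, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        feats = self.backbone(x)                 # (B, C, H, W)
        # Collapse spatial positions into a sequence of C-dim feature vectors.
        seq = feats.flatten(2).permute(0, 2, 1)  # (B, H*W, C)
        out, _ = self.lstm(seq)                  # (B, H*W, 2*hidden)
        context = out.mean(dim=1)                # global average over positions
        return self.classifier(context)          # crack / no-crack logits


if __name__ == "__main__":
    model = CrackDetector()
    dummy = torch.randn(4, 3, 224, 224)          # batch of blade image patches
    print(model(dummy).shape)                    # torch.Size([4, 2])
```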
Pages: 15762-15772
Number of pages: 11
Related Papers
50 records in total
  • [21] YOLO-GEA: infrared target detection algorithm based on multi-scale feature fusion
    Da, Mei
    Tao, Youfeng
    Jiang, Lin
    Hu, Jue
    Zhang, Zhijian
    MEASUREMENT SCIENCE AND TECHNOLOGY, 2025, 36 (04)
  • [22] A Segmentation Algorithm of Colonoscopy Images Based on Multi-Scale Feature Fusion
    Yu, Jing
    Li, Zhengping
    Xu, Chao
    Feng, Bo
    ELECTRONICS, 2022, 11 (16)
  • [23] Source Camera Identification Algorithm Based on Multi-Scale Feature Fusion
    Lu, Jianfeng
    Li, Caijin
    Huang, Xiangye
    Cui, Chen
    Emam, Mahmoud
CMC-COMPUTERS MATERIALS & CONTINUA, 2024, 80 (02): 3047-3065
  • [24] Multi-Scale Receptive Field Feature Fusion Algorithm based on MobileNet
    Huang Yukai
    Wang Qingwang
    Shen Tao
    Zhu Yan
    Song Jian
    LASER & OPTOELECTRONICS PROGRESS, 2023, 60 (04)
  • [25] Weed Detection Based on Multi-scale Fusion Module and Feature Enhancement
    Kang J.
    Liu G.
    Guo G.
Nongye Jixie Xuebao/Transactions of the Chinese Society for Agricultural Machinery, 2022, 53 (04): 254-260
  • [26] Surface Defect Detection Based on Adaptive Multi-Scale Feature Fusion
    Wen, Guochen
    Cheng, Li
    Yuan, Haiwen
    Li, Xuan
    SENSORS, 2025, 25 (06)
  • [27] Multi-Scale Feature Fusion Based Adaptive Object Detection for UAV
    Liu Fang
    Wu Zhiwei
    Yang Anzhe
    Han Xiao
    ACTA OPTICA SINICA, 2020, 40 (10)
  • [28] Loop Closure Detection Based on Multi-Scale Deep Feature Fusion
    Chen, Baifan
    Yuan, Dian
    Liu, Chunfa
    Wu, Qian
APPLIED SCIENCES-BASEL, 2019, 9 (06)
  • [29] Grape clusters detection based on multi-scale feature fusion and augmentation
    Ma, Jinlin
    Xu, Silong
    Ma, Ziping
    Fu, Hong
    Lin, Baobao
SCIENTIFIC REPORTS, 2024, 14 (01)
  • [30] Underwater image object detection based on multi-scale feature fusion
    Yang, Chao
    Zhang, Ce
    Jiang, Longyu
    Zhang, Xinwen
    MACHINE VISION AND APPLICATIONS, 2024, 35 (06)