Mobile-Unet: An efficient convolutional neural network for fabric defect detection

Cited: 140
Authors
Jing, Junfeng [1]
Wang, Zhen [1]
Ratsch, Matthias [1,2]
Zhang, Huanhuan [1]
Affiliations
[1] Xian Polytech Univ, Coll Elect & Informat, Xian, Peoples R China
[2] Reutlingen Univ, Coll Engn, Reutlingen, Germany
Funding
National Natural Science Foundation of China;
Keywords
fabric defect; deep learning; Mobile-Unet; efficient convolutional neural network; CLASSIFICATION;
DOI
10.1177/0040517520928604
CLC Numbers (Chinese Library Classification)
TB3 [Engineering Materials Science]; TS1 [Textile Industry, Dyeing and Finishing Industry];
Discipline Classification Codes
0805 ; 080502 ; 0821 ;
Abstract
Deep learning-based fabric defect detection methods have been widely investigated to improve production efficiency and product quality. Although deep learning-based methods have proved to be powerful tools for classification and segmentation, several key issues remain when they are applied in real production settings. First, the actual production conditions in fabric factories demand high real-time performance. Moreover, fabric defects are abnormal samples that are very rare compared with normal samples, which results in data imbalance and makes training deep learning models challenging. To address these problems, an extremely efficient convolutional neural network, Mobile-Unet, is proposed to achieve end-to-end defect segmentation. A median frequency balancing loss function is used to overcome the challenge of sample imbalance. Additionally, Mobile-Unet introduces depth-wise separable convolution, which dramatically reduces the computational cost and model size of the network. The network comprises two parts: an encoder and a decoder. The MobileNetV2 feature extractor serves as the encoder, and five deconvolution layers are added as the decoder. Finally, a softmax layer generates the segmentation mask. The performance of the proposed model has been evaluated on public and self-built fabric datasets. Compared with other methods, the experimental results demonstrate that the proposed method achieves state-of-the-art segmentation accuracy and detection speed.
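The abstract describes the network structure and loss weighting only at a high level. The sketch below is a minimal, hypothetical PyTorch rendering of that description, for illustration only: a MobileNetV2 backbone as the encoder, five transposed-convolution (deconvolution) stages as the decoder, a softmax layer producing the segmentation mask, and median frequency balancing class weights for the imbalanced defect/background classes. The layer widths, the torchvision backbone, and all helper names are assumptions, not details taken from the paper.

```python
# Illustrative sketch only, assuming a torchvision MobileNetV2 backbone and
# hypothetical decoder widths; not the authors' published implementation.
import torch
import torch.nn as nn
from torchvision import models

class MobileUnetSketch(nn.Module):
    def __init__(self, num_classes=2):
        super().__init__()
        # Encoder: MobileNetV2 feature extractor (built from depth-wise separable
        # convolutions), output stride 32 with 1280 channels.
        self.encoder = models.mobilenet_v2().features
        # Decoder: five transposed-convolution stages, each upsampling by 2 (2^5 = 32).
        channels = [1280, 256, 128, 64, 32]          # assumed widths
        layers = []
        for in_ch, out_ch in zip(channels, channels[1:] + [16]):
            layers += [nn.ConvTranspose2d(in_ch, out_ch, kernel_size=4, stride=2, padding=1),
                       nn.BatchNorm2d(out_ch),
                       nn.ReLU(inplace=True)]
        self.decoder = nn.Sequential(*layers)
        self.classifier = nn.Conv2d(16, num_classes, kernel_size=1)

    def forward(self, x):
        x = self.encoder(x)             # (N, 1280, H/32, W/32)
        x = self.decoder(x)             # (N, 16, H, W)
        x = self.classifier(x)          # (N, num_classes, H, W)
        return torch.softmax(x, dim=1)  # per-pixel class probabilities (segmentation mask)

def median_frequency_weights(pixel_counts):
    """Median frequency balancing: weight_c = median(freq) / freq_c,
    so the rare (defect) class receives a larger loss weight than the normal class."""
    freq = pixel_counts / pixel_counts.sum()
    return freq.median() / freq

# Usage with assumed, heavily imbalanced pixel counts (normal vs. defect):
counts = torch.tensor([9.9e6, 1.0e5])
weights = median_frequency_weights(counts)
# CrossEntropyLoss expects pre-softmax logits, so during training one would feed
# the classifier output before the softmax into this criterion.
criterion = nn.CrossEntropyLoss(weight=weights)

model = MobileUnetSketch(num_classes=2)
out = model(torch.randn(1, 3, 256, 256))  # (1, 2, 256, 256) probability maps
```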
Pages: 30-42
Page count: 13
Related Papers
50 records in total
  • [1] Fabric defect detection based on separate convolutional UNet
    Cheng, Le
    Yi, Jizheng
    Chen, Aibin
    Zhang, Yi
    [J]. MULTIMEDIA TOOLS AND APPLICATIONS, 2023, 82 (02) : 3101 - 3122
  • [2] Fabric Defect Detection Using Deep Convolutional Neural Network
    Biradar, Maheshwari S.
    Shiparamatti, B. G.
    Patil, P. M.
    [J]. OPTICAL MEMORY AND NEURAL NETWORKS, 2021, 30 (03) : 250 - 256
  • [3] Automatic fabric defect detection using a deep convolutional neural network
    Jing, Jun-Feng
    Ma, Hao
    Zhang, Huan-Huan
    [J]. COLORATION TECHNOLOGY, 2019, 135 (03) : 213 - 223
  • [4] Deep Convolutional Neural Network Optimization for Defect Detection in Fabric Inspection
    Ho, Chao-Ching
    Chou, Wei-Chi
    Su, Eugene
    [J]. SENSORS, 2021, 21 (21)
  • [5] A real-time and accurate convolutional neural network for fabric defect detection
    Li, Xueshen
    Zhu, Yong
    [J]. COMPLEX & INTELLIGENT SYSTEMS, 2024, 10 (03) : 3371 - 3387
  • [6] Automatic defect detection for fabric printing using a deep convolutional neural network
    Chakraborty, Samit
    Moore, Marguerite
    Parrillo-Chapman, Lisa
    [J]. INTERNATIONAL JOURNAL OF FASHION DESIGN TECHNOLOGY AND EDUCATION, 2022, 15 (02) : 142 - 157
  • [7] Fabric Defect Detection Using Activation Layer Embedded Convolutional Neural Network
    Ouyang, Wenbin
    Xu, Bugao
    Hou, Jue
    Yuan, Xiaohui
    [J]. IEEE ACCESS, 2019, 7 : 70130 - 70140