Image restoration fabric defect detection based on the dual generative adversarial network patch model

Cited by: 3
Authors
Cheng, Haoming [1 ]
Liang, Jiuzhen [1 ,2 ,3 ]
Liu, Hao [1 ]
Affiliations
[1] Changzhou Univ, Sch Informat & Engn, Changzhou, Jiangsu, Peoples R China
[2] Jiangnan Univ, Sch Digital Media, Wuxi, Jiangsu, Peoples R China
[3] Changzhou Univ, 21 Gehu Middle Rd, Changzhou 213164, Jiangsu, Peoples R China
Keywords
Defect detection; generative adversarial networks; patch model; self-attention mechanism; AUTOMATED INSPECTION; CLASSIFICATION;
DOI
10.1177/00405175221144777
Chinese Library Classification (CLC)
TB3 [Engineering Materials Science]; TS1 [Textile and Dyeing & Finishing Industry];
Subject Classification Codes
0805 ; 080502 ; 0821 ;
Abstract
The training of supervised learning methods requires ground truth, which is difficult to obtain in large quantities in production practice. Unsupervised learning requires only flawless and anomalous images of fabrics, but it inevitably introduces a great deal of background noise when generating results, which reduces their quality. To overcome these limitations, we propose a new approach: image restoration fabric defect detection based on the dual generative adversarial network patch model (DGPM). We train a modified generative adversarial network using only flawless and anomalous images of the fabric. We propose the patch model to directly obtain specific information about fabric defects and add a self-attention model to reduce the generation of background noise. The performance of the DGPM is evaluated on box-, star-, and dot-patterned fabric databases. The true positive rate (TPR) is 81.56% with an F-measure of 62.69% for the box type, 83.72% and 67.33% for the dot type, and 79.79% and 64.65% for the star type.
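The abstract reports performance as true positive rate (TPR) and F-measure. As a point of reference, a minimal sketch of how these two metrics are conventionally computed from detection counts is shown below; the counts used are illustrative placeholders, not figures from the paper.

```python
# Conventional definitions of the two metrics quoted in the abstract.
# The detection counts below are hypothetical, for illustration only.

def true_positive_rate(tp: int, fn: int) -> float:
    """TPR (recall): fraction of actual defect pixels/regions detected."""
    return tp / (tp + fn)

def f_measure(precision: float, recall: float) -> float:
    """F-measure (F1): harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

# Illustrative counts for a hypothetical defect map
tp, fp, fn = 80, 60, 18
precision = tp / (tp + fp)
recall = true_positive_rate(tp, fn)
print(f"TPR = {recall:.2%}, F-measure = {f_measure(precision, recall):.2%}")
```

Note that a high TPR with a noticeably lower F-measure, as in the reported results, indicates that recall outpaces precision, i.e., some background is still flagged as defect.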
Pages: 2859-2876
Page count: 18
Related Papers
50 records in total
  • [21] Diversifying Tire-Defect Image Generation Based on Generative Adversarial Network
    Zhang, Yulong
    Wang, Yilin
    Jiang, Zhiqiang
    Liao, Fagen
    Zheng, Li
    Tan, Dongzeng
    Chen, Jinshui
    Lu, Jiangang
    IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT, 2022, 71
  • [22] Image and Graph Restoration Dependent on Generative Adversarial Network Algorithm
    Cao, Yuanhao
    TEHNICKI VJESNIK-TECHNICAL GAZETTE, 2021, 28 (06): : 1820 - 1824
  • [23] Fabric Defect Detection by Applying Structural Similarity Index to the Combination of Variational Autoencoder and Generative Adversarial Network
    Lee, Chin-Feng
    Chang, Ting-Chia
    2021 INTERNATIONAL CONFERENCE ON SECURITY AND INFORMATION TECHNOLOGIES WITH AI, INTERNET COMPUTING AND BIG-DATA APPLICATIONS, 2023, 314 : 236 - 246
  • [24] Dual attention and channel transformer based generative adversarial network for restoration of the damaged artwork
    Kumar, Praveen
    Gupta, Varun
    Grover, Manan
    ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2024, 128
  • [25] Restoration of damaged artworks based on a generative adversarial network
    Kumar, Praveen
    Gupta, Varun
    MULTIMEDIA TOOLS AND APPLICATIONS, 2023, 82 (26) : 40967 - 40985
  • [26] Large-area damage image restoration algorithm based on generative adversarial network
    Liu, Gang
    Li, Xiaofeng
    Wei, Jin
    NEURAL COMPUTING & APPLICATIONS, 2021, 33 (10): : 4651 - 4661
  • [28] Research on Embroidery Image Restoration Based on Improved Deep Convolutional Generative Adversarial Network
    Liu Yixuan
    Ge Guangying
    Qi Zhenling
    Li Zhenxuan
    Sun Fulin
    LASER & OPTOELECTRONICS PROGRESS, 2023, 60 (20)
  • [29] Blind restoration of astronomical image based on deep attention generative adversarial neural network
    Luo, Lin
    Bao, Jiaqi
    Li, Jinlong
    Gao, Xiaorong
    OPTICAL ENGINEERING, 2022, 61 (01)
  • [30] Motion Defocus Infrared Image Restoration Based on Multi Scale Generative Adversarial Network
    Yi Shi
    Wu Zhijuan
    Zhu Jingming
    Li Xinrong
    Yuan Xuesong
    JOURNAL OF ELECTRONICS & INFORMATION TECHNOLOGY, 2020, 42 (07) : 1766 - 1773