An Algorithm for the Recognition of Motion-Blurred QR Codes Based on Generative Adversarial Networks and Attention Mechanisms

Cited by: 0
Authors
Hao Dong
Haibin Liu
Mingfei Li
Fujie Ren
Feng Xie
Institutions
[1] Beijing University of Technology, Faculty of Materials and Manufacturing
Keywords
QR code identification; Motion deblurring; Generative adversarial network; Attention mechanism
DOI: not available
Abstract
Motion blur degrades the quality of QR code images, making it difficult to recognize QR codes on moving objects. This paper proposes an algorithm for recognizing motion-blurred QR codes based on a generative adversarial network and an attention mechanism. First, a multi-scale feature extraction framework for motion deblurring is designed using deep convolutional neural networks, in which enhanced multi-scale residual blocks and multi-scale feature extraction modules capture rich local and global features. Second, an efficient channel attention module is added to strengthen the weights of effective features and suppress ineffective ones by modeling the correlations between channels. In addition, training stability is improved through the WGAN-div loss function, leading to higher-quality generated samples. Finally, the proposed algorithm is evaluated through qualitative and quantitative comparisons with several recent methods on both the GOPRO public dataset and a self-constructed QR code dataset. The experimental results demonstrate that, compared with the other methods, the proposed algorithm achieves significant improvements in both processing time and recognition accuracy on severely motion-blurred QR codes.
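The efficient channel attention (ECA) idea mentioned in the abstract — squeeze each channel to a scalar, let nearby channels interact through a small 1-D convolution, then gate the channels with a sigmoid — can be sketched as follows. This is an illustrative NumPy forward pass, not the authors' implementation; the fixed averaging kernel stands in for the learned 1-D convolution weights, and the kernel size `k` is an assumed hyperparameter.

```python
import numpy as np

def eca(feature_map, k=3):
    """Illustrative Efficient Channel Attention (ECA) forward pass.

    feature_map: array of shape (C, H, W).
    k: size of the 1-D convolution kernel across channels (odd).
    """
    C, H, W = feature_map.shape
    # 1. Squeeze: global average pooling -> one descriptor per channel.
    z = feature_map.mean(axis=(1, 2))                    # shape (C,)
    # 2. Local cross-channel interaction: 1-D conv of width k over channels.
    #    A uniform kernel is used here in place of learned weights.
    pad = k // 2
    zp = np.pad(z, pad, mode="edge")
    kernel = np.full(k, 1.0 / k)
    attn = np.array([np.dot(zp[i:i + k], kernel) for i in range(C)])
    # 3. Gate with a sigmoid and rescale every channel of the input.
    attn = 1.0 / (1.0 + np.exp(-attn))                   # values in (0, 1)
    return feature_map * attn[:, None, None]
```

Because the sigmoid gate lies strictly in (0, 1), channels with weaker pooled responses are attenuated more, which is how the module "strengthens effective features and suppresses ineffective ones" without the quadratic cost of full channel-to-channel attention.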
Related Papers (50 in total)
  • [21] Image Translation with Attention Mechanism based on Generative Adversarial Networks
    Lu, Yu
    Liu, Ju
    Zhao, Xueyin
    Liu, Xiaoxi
    Chen, Weiqiang
    Gao, Xuesong
    IEEE INFOCOM 2020 - IEEE CONFERENCE ON COMPUTER COMMUNICATIONS WORKSHOPS (INFOCOM WKSHPS), 2020, : 364 - 369
  • [22] Image Hazing Algorithm Based on Generative Adversarial Networks
    Xiao, Jinsheng
    Zhou, Jinglong
    Lei, Junfeng
    Xu, Chuan
    Sui, Haigang
    IEEE ACCESS, 2020, 8 : 15883 - 15894
  • [23] An image translation algorithm based on Generative Adversarial Networks
    Chen, Ruiying
    Liu, Long
    Luo, Zhuo
    2021 3RD INTERNATIONAL CONFERENCE ON MACHINE LEARNING, BIG DATA AND BUSINESS INTELLIGENCE (MLBDBI 2021), 2021, : 369 - 373
  • [24] Knowledge-Grounded Chatbot Based on Dual Wasserstein Generative Adversarial Networks with Effective Attention Mechanisms
    Kim, Sihyung
    Kwon, Oh-Woog
    Kim, Harksoo
    APPLIED SCIENCES-BASEL, 2020, 10 (09):
  • [25] Offline handwritten signature recognition based on generative adversarial networks
    Jiang, Xiaoguang
    INTERNATIONAL JOURNAL OF BIOMETRICS, 2024, 16 (3-4) : 236 - 255
  • [26] Attention mechanism enhancement algorithm based on cycle consistent generative adversarial networks for single image dehazing
    Liu, Yan
    Al-Shehari, Hassan
    Zhang, Hongying
    JOURNAL OF VISUAL COMMUNICATION AND IMAGE REPRESENTATION, 2022, 83
  • [27] Attention mechanism-based generative adversarial networks for image cartoonization
    Zhao, Wenqing
    Zhu, Jianlin
    Li, Ping
    Huang, Jin
    Tang, Junwei
    VISUAL COMPUTER, 2024, 40 (06): 3971 - 3984
  • [28] Generative adversarial networks with multi-scale and attention mechanisms for underwater image enhancement
    Wang, Ziyang
    Zhao, Liquan
    Zhong, Tie
    Jia, Yanfei
    Cui, Ying
    FRONTIERS IN MARINE SCIENCE, 2023, 10
  • [29] Image Motion Blur Removal Algorithm Based on Generative Adversarial Network
    Kim, Jongchol
    Kim, Myongchol
    Kim, Insong
    Han, Gyongwon
    Jong, Myonghak
    Ri, Gwuangwon
    PROGRAMMING AND COMPUTER SOFTWARE, 2024, 50 (05) : 403 - 415
  • [30] Anomaly Recognition Algorithm for Human Multipose Motion Behavior Using Generative Adversarial Network
    Zhang, Nan
    Ren, Jie
    Xu, Qixiao
    Wu, Hao
    Wang, Meng
    WIRELESS COMMUNICATIONS & MOBILE COMPUTING, 2022, 2022