Traffic signs recognition based on visual attention mechanism

Cited by: 0
Authors
Guo H.-R. [1 ,2 ]
Wang X.-J. [1 ]
Zhong Y.-X. [1 ]
Lu P. [1 ]
Affiliations
[1] Center for Intelligence Science and Technology, Beijing University of Posts and Telecommunications
[2] College of Computer Science and Technology, Henan Polytechnic University
Funding
Specialized Research Fund for the Doctoral Program of Higher Education; National Natural Science Foundation of China
Keywords
CIELab color space; dynamic threshold; SIFT; traffic signs; visual attention;
DOI
10.1016/S1005-8885(10)60139-2
Abstract
In order to improve the speed and accuracy of traffic sign recognition (TSR), a novel method based on the visual attention mechanism is presented in this paper. In this method, potential traffic-sign regions are detected by extracting color-contrast features in the CIELab color space with a visual attention model, while dynamic thresholds are used to adjust the timing of the inhibition of return. Scale-invariant feature transform (SIFT) features are then extracted from each candidate region and compared against standard traffic signs by similarity calculation, so that the traffic sign is recognized. Compared with traditional methods, the proposed method achieves rapid and accurate detection and recognition, requires no segmentation, involves less computation, and is biologically inspired. Experimental results on two traffic-sign databases, one collected domestically and one in the Netherlands, show the method's strong performance on TSR. © 2011 The Journal of China Universities of Posts and Telecommunications.
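The abstract describes a two-stage pipeline: attention-driven candidate detection from CIELab color contrast, followed by SIFT matching against standard sign templates. The following Python/OpenCV sketch only illustrates that flow under assumed details; the contrast measure, thresholds, and ratio-test parameters are hypothetical stand-ins, and the paper's dynamic-threshold inhibition of return is not reproduced.

```python
# Minimal sketch of the two-stage pipeline suggested by the abstract.
# All thresholds and the saliency formulation are illustrative assumptions,
# not the authors' exact method.
import cv2
import numpy as np

def candidate_regions(bgr_image, contrast_thresh=30, min_area=100):
    """Hypothesize traffic-sign regions from color contrast in CIELab space."""
    lab = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2LAB)
    _, a, b = cv2.split(lab)
    # Contrast of the chromatic channels against their image means is used
    # here as a simple stand-in for an attention-style color-contrast map.
    contrast = (np.abs(a.astype(np.float32) - a.mean()) +
                np.abs(b.astype(np.float32) - b.mean()))
    mask = (contrast > contrast_thresh).astype(np.uint8) * 255
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    return [cv2.boundingRect(c) for c in contours
            if cv2.contourArea(c) > min_area]

def match_sign(candidate_bgr, template_bgr, ratio=0.75, min_matches=10):
    """Match SIFT features of a candidate region against a standard sign."""
    sift = cv2.SIFT_create()
    gray1 = cv2.cvtColor(candidate_bgr, cv2.COLOR_BGR2GRAY)
    gray2 = cv2.cvtColor(template_bgr, cv2.COLOR_BGR2GRAY)
    _, des1 = sift.detectAndCompute(gray1, None)
    _, des2 = sift.detectAndCompute(gray2, None)
    if des1 is None or des2 is None or len(des2) < 2:
        return False
    matches = cv2.BFMatcher().knnMatch(des1, des2, k=2)
    # Lowe's ratio test; min_matches is an illustrative decision threshold.
    good = [p[0] for p in matches
            if len(p) == 2 and p[0].distance < ratio * p[1].distance]
    return len(good) >= min_matches
```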
Pages: 12-16
Number of pages: 4
Related Papers
50 records in total
  • [1] Automatic Recognition of Traffic Signs Based on Visual Inspection
    He, Shouhui
    Chen, Lei
    Zhang, Shaoyun
    Guo, Zhuangxian
    Sun, Pengjie
    Liu, Hong
    Liu, Hongda
    [J]. IEEE ACCESS, 2021, 9 : 43253 - 43261
  • [2] A Visual Attention based Method for Detecting Traffic Signs of Interest
    Yu, Yuanlong
    Gu, Zhaojie
    Liu, Huaping
    Gu, Jason
    [J]. 2014 IEEE INTERNATIONAL CONFERENCE ON INFORMATION AND AUTOMATION (ICIA), 2014, : 290 - 294
  • [3] Applying a visual attention mechanism to the problem of traffic sign recognition
    Rodrigues, FA
    Gomes, HM
    [J]. SIBGRAPI 2002: XV BRAZILIAN SYMPOSIUM ON COMPUTER GRAPHICS AND IMAGE PROCESSING, PROCEEDINGS, 2002, : 415 - 415
  • [4] Visual Recognition of Traffic Signs in Natural Scenes Based on Improved RetinaNet
    Liu, Shangwang
    Cai, Tongbo
    Tang, Xiufang
    Zhang, Yangyang
    Wang, Changgeng
    [J]. ENTROPY, 2022, 24 (01)
  • [5] A ship target recognition method based on biological visual attention mechanism
    Ma Xiao
    Chen Zhongwei
    Suo Jun
    Zhuansun Xiaobo
    Ni Jiazheng
    Zhang Shuai
    Liu Mo
    [J]. AOPC 2021: NOVEL TECHNOLOGIES AND INSTRUMENTS FOR ASTRONOMICAL MULTI-BAND OBSERVATIONS, 2021, 12069
  • [6] Automatic visual recognition for leaf disease based on enhanced attention mechanism
    Yao, Yumeng
    Deng, Xiaodun
    Zhang, Xu
    Li, Junming
    Sun, Wenxuan
    Zhang, Gechao
    [J]. PeerJ Computer Science, 2024, 10
  • [7] Microstructure recognition of steels by machine learning based on visual attention mechanism
    Xing-yu Chen
    Lin Cheng
    Cheng-yang Hu
    Yu-peng Zhang
    Kai-ming Wu
    [J]. Journal of Iron and Steel Research International, 2024, 31 : 909 - 923
  • [8] Microstructure recognition of steels by machine learning based on visual attention mechanism
    Chen, Xing-yu
    Cheng, Lin
    Hu, Cheng-yang
    Zhang, Yu-peng
    Wu, Kai-ming
    [J]. JOURNAL OF IRON AND STEEL RESEARCH INTERNATIONAL, 2024, 31 (04) : 909 - 923
  • [9] Recognition of traffic signs
    Aigerim, Sagandykova
    Meirambek, Zhaparov
    Askar, Zhakenov
    Yedilkhan, Amirgaliyev
    [J]. 2014 IEEE 8th International Conference on Application of Information and Communication Technologies (AICT), 2014, : 220 - 225
  • [10] Detection and Recognition of Traffic Signs Inside the Attentional Visual Field of Drivers
    Zabihi, S. J.
    Zabihi, S. M.
    Beauchemin, S. S.
    Bauer, M. A.
    [J]. 2017 28TH IEEE INTELLIGENT VEHICLES SYMPOSIUM (IV 2017), 2017, : 583 - 588