An Accurate and Fast Method to Inspect Rail Wear Based on Revised Global Registration

Cited: 21

Authors
Yang, Yue [1 ]
Liu, Long [1 ]
Yi, Bing [1 ,2 ]
Chen, Feng [1 ]
Affiliations
[1] Cent S Univ, Sch Traff & Transportat Engn, Changsha 410075, Hunan, Peoples R China
[2] Univ Michigan, Dept Mech Engn, Ann Arbor, MI 48105 USA
Source
IEEE ACCESS, 2018, Vol. 6
Funding
National Natural Science Foundation of China
Keywords
Laser profile sensor; rail wear inspection; RFGR; Hausdorff distance; GO-ICP;
DOI
10.1109/ACCESS.2018.2873903
CLC classification number
TP [Automation technology, computer technology]
Discipline classification code
0812
Abstract
Rail wear inspection is vitally important in the railway industry. Conventional methods rely mainly on manual or static measurements to detect rail wear, which are inefficient, imprecise, and unreliable. To improve the accuracy and efficiency of rail wear inspection, a dynamic detection system based on a revised fast global registration (RFGR) algorithm was employed. First, a framework for online detection of rail wear using multiple profiles was put forward to reduce the influence of vibration on individual sections. Second, the RFGR method was proposed: a robust weight function converts the non-convex registration model into a convex problem, and the Levenberg-Marquardt method solves the resulting nonlinear least-squares system robustly. Finally, the Hausdorff distance was introduced to visualize the deviation between the worn profile and the reference profile after alignment. The experimental results demonstrated that the RFGR algorithm was more accurate, robust, and efficient than iterative closest point (ICP), sparse ICP, Yi's sparse ICP, and the fast global registration algorithm. For actual wear detection, the proposed method was more efficient and robust for online dynamic detection of rail wear than the single-profile-section-based inspection method.
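The Hausdorff distance used in the abstract to quantify the gap between a measured wear profile and the reference profile can be sketched as follows. This is a generic brute-force implementation, not the paper's own code; the function name, the point-set layout (N×2 arrays of 2-D profile points), and the symmetric variant are our assumptions:

```python
import numpy as np

def hausdorff_distance(P: np.ndarray, Q: np.ndarray) -> float:
    """Symmetric Hausdorff distance between two 2-D point sets.

    P: (m, 2) array of measured profile points.
    Q: (n, 2) array of reference profile points.
    """
    # Pairwise Euclidean distances, shape (m, n).
    d = np.linalg.norm(P[:, None, :] - Q[None, :, :], axis=-1)
    # Directed distances: the worst-case nearest-neighbour gap
    # from P to Q, and from Q to P.
    h_pq = d.min(axis=1).max()
    h_qp = d.min(axis=0).max()
    # Symmetric Hausdorff distance is the larger of the two.
    return float(max(h_pq, h_qp))
```

In the paper's pipeline this metric is evaluated only after RFGR has aligned the profiles, so the value reflects wear rather than pose error; `scipy.spatial.distance.directed_hausdorff` offers an equivalent one-directional computation.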
Pages: 57267-57278
Page count: 12
Related Papers
50 records in total
  • [1] A Fast and Accurate Planar-Feature-Based Global Scan Registration Method
    Zong, Wenpeng
    Li, Minglei
    Zhou, Yanglin
    Wang, Li
    Xiang, Fengzhuo
    Li, Guangyun
    [J]. IEEE SENSORS JOURNAL, 2019, 19 (24) : 12333 - 12345
  • [2] Simple and fast rail wear measurement method based on structured light
    Liu, Zhen
    Sun, Junhua
    Wang, Heng
    Zhang, Guangjun
    [J]. OPTICS AND LASERS IN ENGINEERING, 2011, 49 (11) : 1343 - 1351
  • [3] Third Rail Wear Inspection Based on Point Cloud Registration
    Xin, Lim Jia
    [J]. 2021 7TH INTERNATIONAL CONFERENCE ON CONTROL, AUTOMATION AND ROBOTICS (ICCAR), 2021, : 290 - 294
  • [4] A fast and accurate method for registration of MR images of the head
    Kotsas, P
    Malasiotis, S
    Strintzis, M
    Piraino, DW
    Cornhill, JF
    [J]. INTERNATIONAL JOURNAL OF MEDICAL INFORMATICS, 1998, 52 (1-3) : 167 - 182
  • [5] Fast and accurate nonlinear spectral method for image recognition and registration
    Costa, Luciano da Fontoura
    Bollt, Erik
    [J]. APPLIED PHYSICS LETTERS, 2006, 89 (17)
  • [6] Real-time and accurate rail wear measurement method and experimental analysis
    Liu, Zhen
    Li, Fengjiao
    Huang, Bangkui
    Zhang, Guangjun
    [J]. JOURNAL OF THE OPTICAL SOCIETY OF AMERICA A-OPTICS IMAGE SCIENCE AND VISION, 2014, 31 (08) : 1721 - 1729
  • [7] An Accurate Registration Method Based on Global Mixed Structure Similarity (GMSIM) for Remote Sensing Images
    Yang, Han
    Li, Xiaorun
    Chen, Shuhan
    Zhao, Liaoying
    [J]. IEEE GEOSCIENCE AND REMOTE SENSING LETTERS, 2022, 19
  • [8] Rail wear measuring method based on dynamic template
    Zheng, Shubin
    Chai, Xiaodong
    An, Xiaoxue
    Li, Liming
    [J]. Zhongguo Tiedao Kexue/China Railway Science, 2013, 34 (02): : 7 - 12
  • [9] A Fast Registration Method Based on Line Features
    Zhang, Meng
    Yang, Yi
    Jiang, Qinghua
    Zhang, Sixian
    [J]. PROCEEDINGS OF THE 39TH CHINESE CONTROL CONFERENCE, 2020, : 2918 - 2923
  • [10] Predicting the wear on the non-working side of the rail profile registration method and its validation
    Nie, Liangtao
    Li, Weimin
    Qu, Xiangyu
    Jing, Guoqing
    Liu, Xiubo
    Feng, Xiaomeng
    [J]. MEASUREMENT SCIENCE AND TECHNOLOGY, 2024, 35 (11)