Robust Identification of Road Surface Condition Based on Ego-Vehicle Trajectory Reckoning

Citations: 0
Authors
Cheng Tian
Bo Leng
Xinchen Hou
Yuyao Huang
Wenrui Zhao
Da Jin
Lu Xiong
Junqiao Zhao
Affiliations
[1] Tongji University, School of Automotive Studies
[2] Tongji University, College of Electronic and Information Engineering
Source
Automotive Innovation | 2022 / Volume 5
Keywords
Road surface identification; Ego-Vehicle trajectory reckoning; Multi-task learning; Dempster-Shafer evidence theory; Autonomous vehicle;
DOI
Not available
Abstract
The type of road surface condition (RSC) directly affects the driving performance of vehicles, so monitoring it is essential for both transportation agencies and individual drivers. However, most existing approaches rely solely on either a dynamics-based or an image-based method, making them susceptible to limited road excitation and to interference from the external environment. This paper therefore proposes a decision-level fusion framework for RSC identification based on ego-vehicle trajectory reckoning, which accurately obtains the type of RSC that the vehicle's front wheels will experience. First, a road feature extraction model based on multi-task learning is constructed, which simultaneously segments the drivable area and road cast shadows. Second, optimized candidate regions of interest are classified with confidence levels by ShuffleNet. To account for environmental interference, the candidate regions of interest, treated as virtual sensors, are fused by an improved Dempster-Shafer evidence theory to obtain the fusion result. Finally, an ego-vehicle trajectory reckoning module based on the kinematic bicycle model is added to the fusion method to extract the RSC experienced by the front wheels. The performance of the entire framework is verified on a dedicated dataset containing shadowed and split curve roads. The results show that the proposed method identifies the RSC accurately and in real time.
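The fusion step described in the abstract combines the confidence levels of several candidate regions of interest, each treated as a virtual sensor, via Dempster-Shafer evidence theory. The abstract does not specify the paper's improved variant, so the sketch below implements only the classical combination rule; the surface-class names and mass values are illustrative assumptions, not values from the paper.

```python
# Classical Dempster-Shafer combination of two mass functions over a shared
# frame of discernment (the set of possible road surface classes).
# Masses are dicts mapping frozenset-of-hypotheses -> belief mass (summing to 1).

def ds_combine(m1, m2):
    """Fuse two mass functions with Dempster's rule of combination."""
    combined = {}
    conflict = 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb  # mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("Total conflict: sources cannot be combined")
    norm = 1.0 - conflict  # renormalize by discarding conflicting mass
    return {h: m / norm for h, m in combined.items()}

# Two "virtual sensors" (candidate ROIs) voting on the road surface type.
# The full frame {dry, wet, snow} carries each sensor's residual uncertainty.
DRY, WET, SNOW = "dry", "wet", "snow"
theta = frozenset({DRY, WET, SNOW})
roi_a = {frozenset({DRY}): 0.7, frozenset({WET}): 0.2, theta: 0.1}
roi_b = {frozenset({DRY}): 0.6, frozenset({SNOW}): 0.1, theta: 0.3}
fused = ds_combine(roi_a, roi_b)
```

Because both regions lean toward the same hypothesis, the combined mass on `{dry}` exceeds either sensor's individual belief; the paper's improvement presumably addresses the known weakness of this rule when the sources conflict strongly.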
Pages: 376 - 387
Page count: 11
Related Papers
50 records in total
  • [1] Robust Identification of Road Surface Condition Based on Ego-Vehicle Trajectory Reckoning
    Tian, Cheng; Leng, Bo; Hou, Xinchen; Huang, Yuyao; Zhao, Wenrui; Jin, Da; Xiong, Lu; Zhao, Junqiao
    [J]. AUTOMOTIVE INNOVATION, 2022, 5 (04) : 376 - 387
  • [2] Pedestrian and Ego-vehicle Trajectory Prediction from Monocular Camera
    Neumann, Lukas
    Vedaldi, Andrea
    [J]. 2021 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION, CVPR 2021, 2021, : 10199 - 10207
  • [3] Classifying Ego-Vehicle Road Maneuvers from Dashcam Video
    Zekany, Stephen A.
    Dreslinski, Ronald G.
    Wenisch, Thomas F.
    [J]. 2019 IEEE INTELLIGENT TRANSPORTATION SYSTEMS CONFERENCE (ITSC), 2019, : 1204 - 1210
  • [4] Lane Identification and Ego-Vehicle Accurate Global Positioning in Intersections
    Popescu, Voichita
    Bace, Mihai
    Nedevschi, Sergiu
    [J]. 2011 IEEE INTELLIGENT VEHICLES SYMPOSIUM (IV), 2011, : 870 - 875
  • [5] Persistent Homology in LiDAR-Based Ego-Vehicle Localization
    Akai, Naoki
    Hirayama, Takatsugu
    Murase, Hiroshi
    [J]. 2021 32ND IEEE INTELLIGENT VEHICLES SYMPOSIUM (IV), 2021, : 889 - 896
  • [6] Ego-Vehicle Corridors for Vision-Based Driver Assistance
    Jiang, Ruyi
    Klette, Reinhard
    Vaudrey, Tobi
    Wang, Shigang
    [J]. COMBINATORIAL IMAGE ANALYSIS, PROCEEDINGS, 2009, 5852 : 238 - +
  • [8] See the Future: A Semantic Segmentation Network Predicting Ego-Vehicle Trajectory With a Single Monocular Camera
    Sun, Yuxiang
    Zuo, Weixun
    Liu, Ming
    [J]. IEEE ROBOTICS AND AUTOMATION LETTERS, 2020, 5 (02): : 3066 - 3073
  • [9] Ego-Vehicle Action Recognition based on Semi-Supervised Contrastive Learning
    Noguchi, Chihiro
    Tanizawa, Toshihiro
    [J]. 2023 IEEE/CVF WINTER CONFERENCE ON APPLICATIONS OF COMPUTER VISION (WACV), 2023, : 5977 - 5987
  • [10] Learning to Predict Ego-Vehicle Poses for Sampling-Based Nonholonomic Motion Planning
    Banzhaf, Holger
    Sanzenbacher, Paul
    Baumann, Ulrich
    Zoellner, J. Marius
    [J]. IEEE ROBOTICS AND AUTOMATION LETTERS, 2019, 4 (02) : 1053 - 1060