End-to-End Velocity Estimation for Autonomous Racing

Cited by: 21
Authors
Srinivasan, Sirish [1]
Sa, Inkyu [2]
Zyner, Alex [1]
Reijgwart, Victor [1]
Valls, Miguel I. [3]
Siegwart, Roland [1]
Affiliations
[1] Swiss Fed Inst Technol, Autonomous Syst Lab, CH-8092 Zurich, Switzerland
[2] CSIRO, Robot & Autonomous Syst Grp, Pullenvale, Qld 4069, Australia
[3] Sevensense Robot AG, CH-8006 Zurich, Switzerland
Keywords
Field robots; autonomous vehicle navigation; sensor fusion; state estimation
DOI
10.1109/LRA.2020.3016929
CLC Classification Number
TP24 [Robotics]
Discipline Classification Code
080202; 1405
Abstract
Velocity estimation plays a central role in driverless vehicles, but standard, affordable methods struggle to cope with extreme scenarios such as aggressive maneuvers, where sideslip is high. To compensate, autonomous race cars are usually equipped with expensive external velocity sensors. In this letter, we present an end-to-end recurrent neural network that takes readily available raw sensor data as input (IMU, wheel odometry, and motor currents) and outputs velocity estimates. The results are compared to two state-of-the-art Kalman filters, which respectively include and exclude the expensive velocity sensors. All methods were extensively tested on a Formula Student driverless race car with very high sideslip (10 degrees at the rear axle) and slip ratio (approximately 20%), operating close to the limits of handling. The proposed network estimates lateral velocity up to 15x more accurately than the Kalman filter with the equivalent sensor input, and matches (0.06 m/s RMSE) the Kalman filter that uses the expensive velocity sensor setup.
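The abstract describes a recurrent network that maps raw IMU, wheel-odometry, and motor-current sequences directly to velocity estimates. As a rough illustration only, the following Python sketch shows what such a sequence-to-sequence estimator could look like; the GRU choice, layer sizes, and input/output dimensions are assumptions made here for clarity, not the architecture used in the letter.

import torch
import torch.nn as nn

class VelocityRNN(nn.Module):
    # Toy recurrent velocity estimator: raw sensor sequences in, planar velocities out.
    def __init__(self, n_inputs=12, hidden=64, n_outputs=3):
        # n_inputs: assumed 6 IMU axes + 4 wheel speeds + 2 motor currents
        # n_outputs: e.g. longitudinal velocity, lateral velocity, yaw rate
        super().__init__()
        self.rnn = nn.GRU(n_inputs, hidden, num_layers=2, batch_first=True)
        self.head = nn.Linear(hidden, n_outputs)

    def forward(self, x, h=None):
        # x: (batch, time, n_inputs) sequence of raw sensor samples
        out, h = self.rnn(x, h)
        return self.head(out), h  # per-timestep velocity estimates

# Usage: one 100-step sequence of (synthetic) sensor readings
model = VelocityRNN()
vel, _ = model(torch.randn(1, 100, 12))  # vel has shape (1, 100, 3)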
Pages: 6869-6875
Number of pages: 7
Related Papers
50 records in total
  • [1] Towards Safety Assured End-to-End Vision-Based Control for Autonomous Racing
    Kalaria, Dvij
    Lin, Qin
    Dolan, John M.
    IFAC PAPERSONLINE, 2023, 56 (02): 2767 - 2773
  • [2] Intermediate Tasks Enhanced End-to-End Autonomous Driving with Uncertainty Estimation
    Huang, Xuean
    Su, Jianmei
    PROCEEDINGS OF THE 2024 27TH INTERNATIONAL CONFERENCE ON COMPUTER SUPPORTED COOPERATIVE WORK IN DESIGN, CSCWD 2024, 2024, : 133 - 138
  • [3] Multimodal End-to-End Autonomous Driving
    Xiao, Yi
    Codevilla, Felipe
    Gurram, Akhil
    Urfalioglu, Onay
    Lopez, Antonio M.
    IEEE TRANSACTIONS ON INTELLIGENT TRANSPORTATION SYSTEMS, 2022, 23 (01) : 537 - 547
  • [4] An End-to-End Stochastic Action and Visual Estimation System Towards Autonomous Teleoperation
    Akay, Abdullah
    Akgul, Yusuf Sinan
    IEEE ACCESS, 2022, 10 : 16700 - 16719
  • [5] GenAD: Generative End-to-End Autonomous Driving
    Zheng, Wenzhao
    Song, Ruiqi
    Guo, Xianda
    Zhang, Chenming
    Chen, Long
    COMPUTER VISION - ECCV 2024, PT LXV, 2025, 15123 : 87 - 104
  • [6] End-to-end Autonomous Driving: Advancements and Challenges
    Chu, Duan-Feng
    Wang, Ru-Kang
    Wang, Jing-Yi
    Hua, Qiao-Zhi
    Lu, Li-Ping
    Wu, Chao-Zhong
    Zhongguo Gonglu Xuebao/China Journal of Highway and Transport, 2024, 37 (10): 209 - 232
  • [7] End-to-End Autonomous Driving in CARLA: A Survey
    Al Ozaibi, Youssef
    Hina, Manolo Dulva
    Ramdane-Cherif, Amar
    IEEE ACCESS, 2024, 12 : 146866 - 146900
  • [8] End-to-End Autonomous Driving: Challenges and Frontiers
    Chen, Li
    Wu, Penghao
    Chitta, Kashyap
    Jaeger, Bernhard
    Geiger, Andreas
    Li, Hongyang
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2024, 46 (12) : 10164 - 10183
  • [9] PillarFlow: End-to-end Birds-eye-view Flow Estimation for Autonomous Driving
    Lee, Kuan-Hui
    Kliemann, Matthew
    Gaidon, Adrien
    Li, Jie
    Fang, Chao
    Pillai, Sudeep
    Burgard, Wolfram
    2020 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2020, : 2007 - 2013
  • [10] Balanced Training for the End-to-End Autonomous Driving Model Based on Kernel Density Estimation
    Yao, Tong
    Yuan, Wei
    Zhang, Songan
    Wang, Chunxiang
    2024 35TH IEEE INTELLIGENT VEHICLES SYMPOSIUM, IEEE IV 2024, 2024, : 2361 - 2366