Intention Recognition of Pedestrians and Cyclists by 2D Pose Estimation

Cited by: 55
Authors
Fang, Zhijie [1,2]
Lopez, Antonio M. [1,2]
Affiliations
[1] Univ Autonoma Barcelona UAB, Dept Comp Sci, Barcelona 08193, Spain
[2] Univ Autonoma Barcelona UAB, Comp Vis Ctr CVC, Barcelona 08193, Spain
Keywords
Autonomous driving; ADAS; computer vision; pedestrian intention recognition; cyclist intention recognition
DOI
10.1109/TITS.2019.2946642
Chinese Library Classification (CLC)
TU [Architecture Science];
Subject Classification Code
0813;
Abstract
Anticipating the intentions of vulnerable road users (VRUs) such as pedestrians and cyclists is critical for performing safe and comfortable driving maneuvers. This is the case for human driving and, thus, should be taken into account by systems providing any level of driving assistance, from advanced driver assistance systems (ADAS) to fully autonomous vehicles (AVs). In this paper, we show how the latest advances in monocular vision-based human pose estimation, i.e. those relying on deep Convolutional Neural Networks (CNNs), make it possible to recognize the intentions of such VRUs. In the case of cyclists, we assume that they follow traffic rules and indicate future maneuvers with arm signals. In the case of pedestrians, no such indications can be assumed. Instead, we hypothesize that the walking pattern of a pedestrian reveals whether he/she intends to cross the road in the path of the ego-vehicle, so that the ego-vehicle must maneuver accordingly (e.g. by slowing down or stopping). We show how the same methodology can be used for recognizing both pedestrians' and cyclists' intentions. For pedestrians, we perform experiments on the JAAD dataset. For cyclists, we did not find an analogous dataset, so we created our own by acquiring and annotating videos, which we share with the research community. Overall, the proposed pipeline provides new state-of-the-art results on the intention recognition of VRUs.
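The abstract describes the pipeline only at a high level: a CNN-based 2D pose estimator provides per-frame skeletons of each tracked VRU, and a classifier run over short temporal windows of those skeletons outputs the intention (e.g. crossing vs. not crossing for pedestrians, or an arm signal for cyclists). The following is a minimal sketch of that idea, assuming keypoints are already available from an off-the-shelf pose estimator; the joint indices, window length, normalization, and the Random Forest classifier are illustrative assumptions, not necessarily the authors' exact configuration.

```python
# Minimal sketch of a pose-sequence intention classifier. It assumes an
# off-the-shelf 2D pose estimator (e.g. an OpenPose-style network) already
# provides per-frame keypoints for each tracked VRU. Window length,
# normalization, and classifier choice are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

N_KEYPOINTS = 18          # 18-keypoint COCO-style skeleton (assumption)
WINDOW = 14               # number of consecutive frames fed to the classifier


def normalize_pose(keypoints):
    """Center keypoints on the hip midpoint and scale by torso length.

    keypoints: (N_KEYPOINTS, 2) array of (x, y) image coordinates.
    Returns a flat, translation- and scale-invariant feature vector.
    """
    # Indices follow the 18-keypoint COCO ordering (assumption):
    # 1 = neck, 8 = right hip, 11 = left hip.
    hips = (keypoints[8] + keypoints[11]) / 2.0
    centered = keypoints - hips
    torso = np.linalg.norm(keypoints[1] - hips) + 1e-6
    return (centered / torso).ravel()


def window_features(pose_sequence):
    """Stack the normalized poses of one temporal window into one vector."""
    return np.concatenate([normalize_pose(p) for p in pose_sequence])


# Illustrative training on synthetic data (placeholder for JAAD-style labels).
rng = np.random.default_rng(0)
n_samples = 200
X = np.stack([
    window_features(rng.uniform(0, 640, size=(WINDOW, N_KEYPOINTS, 2)))
    for _ in range(n_samples)
])
y = rng.integers(0, 2, size=n_samples)   # 1 = "will cross", 0 = "will not cross"

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X, y)

# At run time, slide a window over the tracked pedestrian's pose history and
# re-classify every frame; retrained with arm-signal labels, the same pipeline
# covers the cyclist case.
print(clf.predict(X[:5]))
```

In this sketch the intention label is produced independently per window; smoothing the per-frame decisions over time (or using a recurrent model instead of a window-based classifier) is a natural variation of the same idea.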
Pages: 4773 - 4783
Page count: 11