Determining the onset of driver's preparatory action for take-over in automated driving using multimodal data

Cited by: 1
Authors
Teshima, Takaaki [1]
Niitsuma, Masahiro [1]
Nishimura, Hidekazu [1]
Affiliations
[1] Keio Univ, Grad Sch Syst Design & Management, Collaborat Complex, 4-1-1 Hiyoshi, Kohoku Ku, Yokohama 2238526, Japan
Keywords
Preparatory action; Take-over; Automated driving; Multimodal data fusion; Change-point detection; HUMAN ACTIVITY RECOGNITION; MODEL; TIME; SITUATIONS; DROWSINESS; ATTENTION; VEHICLES; SYSTEM; TASKS
DOI
10.1016/j.eswa.2024.123153
Chinese Library Classification (CLC) number
TP18 [Artificial intelligence theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Automated driving technology has the potential to substantially reduce traffic accidents, a considerable portion of which are caused by human error. Nonetheless, until automated driving systems reach Level 5, at which the vehicle can drive itself under all road conditions, there will be situations requiring driver intervention. In these situations, drivers perform actions to prepare for the take-over, such as shifting their visual attention to the road, placing their hands on the steering wheel, and placing their feet on the pedals. Proper execution of these preparatory actions is critical for a safe take-over, so it is crucial to analyze and verify that the actions are initiated properly in take-over situations. However, analyzing or verifying preparatory actions currently requires manual observation of video footage, which is laborious. We therefore propose a method to automatically determine the onset of a driver's preparatory action for a take-over. The method provides a binary signal indicating the onset of the action, which can serve as an informative marker; for example, its timing can be used to verify whether a Human Machine Interface (HMI) under development effectively prompts the driver to initiate a preparatory action within the expected time frame. The method uses a multimodal fusion model to classify preparatory actions based on the driver's upper-body video, seat pressure, and eye potential measured at the temples. The onset of the preparatory action is then determined by applying a change-point detection technique to the time series of predicted probabilities produced by the classifier. We created a dataset of 300 take-over events collected from 30 subjects and evaluated the method using 5-fold cross-validation. The results demonstrate that the method can classify preparatory actions with an accuracy of 93.9% and determine the actions' onset with a time error of 0.15 s.
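The abstract describes a two-stage pipeline: a multimodal classifier outputs, per frame, the probability that the driver is performing a preparatory action, and a change-point detector then locates the onset within that probability series. The sketch below illustrates only the second stage, using a one-sided CUSUM-style detector on a synthetic probability trace; the detector choice, the `drift` and `threshold` parameters, and the frame rate are illustrative assumptions and are not taken from the paper.

```python
import numpy as np

def detect_onset(probs, fps=30.0, drift=0.05, threshold=1.5):
    """Locate the onset of a preparatory action in a series of per-frame
    predicted probabilities using a one-sided CUSUM test (illustrative only;
    the paper's actual change-point technique may differ).

    probs:     1-D array of P(preparatory action) over time.
    fps:       frames per second of the probability series (assumed value).
    drift:     allowance subtracted at each step to suppress noise.
    threshold: cumulative-sum level that triggers a detection.
    Returns the estimated onset time in seconds, or None if no change is found.
    """
    baseline = np.mean(probs[: int(fps)])  # reference level from the first second
    cusum = 0.0
    for i, p in enumerate(probs):
        cusum = max(0.0, cusum + (p - baseline) - drift)
        if cusum > threshold:
            return i / fps  # first frame at which the upward shift is confirmed
    return None

# Synthetic example: probability stays low for 2 s, then rises sharply.
rng = np.random.default_rng(0)
probs = np.concatenate([
    rng.normal(0.1, 0.03, 60),   # no preparatory action
    rng.normal(0.9, 0.03, 90),   # preparatory action in progress
]).clip(0, 1)

print(f"Estimated onset: {detect_onset(probs):.2f} s")  # roughly 2.0 s
```

In this toy setup the cumulative sum stays near zero while the probability hovers around the baseline and crosses the threshold within a few frames of the simulated shift, which is the behavior a change-point detector needs in order to yield a small onset-time error.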
Pages: 11
Related papers
50 records in total
  • [21] Assisted Partial Take-Over in Conditionally Automated Driving: A User Study
    Gruden, Timotej
    Sodnik, Jaka
    Jakus, Grega
    IEEE ACCESS, 2023, 11 : 107940 - 107957
  • [22] Assisting Drivers with Ambient Take-Over Requests in Highly Automated Driving
    Borojeni, Shadan Sadeghian
    Chuang, Lewis
    Heuten, Wilko
    Boll, Susanne
    8TH INTERNATIONAL CONFERENCE ON AUTOMOTIVE USER INTERFACES AND INTERACTIVE VEHICULAR APPLICATIONS (AUTOMOTIVEUI 2016), 2016, : 237 - 244
  • [23] Improving Take-Over Quality in Automated Driving By Interrupting Non-Driving Tasks
    Koehn, Thomas
    Gottlieb, Matthias
    Schermann, Michael
    Krcmar, Helmut
    PROCEEDINGS OF IUI 2019, 2019, : 510 - 517
  • [24] Take-over expectation and criticality in Level 3 automated driving: a test track study on take-over behavior in semi-trucks
    Lotz, Alexander
    Russwinkel, Nele
    Wohlfarth, Enrico
    COGNITION TECHNOLOGY & WORK, 2020, 22 (04) : 733 - 744
  • [26] Take-Over Requests Analysis in Conditional Automated Driving and Driver Visual Research Under Encountering Road Hazard of Highway
    You, Fang
    Wang, Yujia
    Wang, Jianmin
    Zhu, Xichan
    Hansen, Preben
    ADVANCES IN HUMAN FACTORS AND SYSTEMS INTERACTION, 2018, 592 : 230 - 240
  • [27] Take-over again: Investigating multimodal and directional TORs to get the driver back into the loop
    Petermeijer, Sebastiaan
    Bazilinskyy, Pavlo
    Bengler, Klaus
    de Winter, Joost
    APPLIED ERGONOMICS, 2017, 62 : 204 - 215
  • [28] Experimental Design for Multi-modal Take-over Request for Automated Driving
    Yun, Hanna
    Lee, Ji Won
    Yang, Hee Dong
    Yang, Ji Hyun
    HCI INTERNATIONAL 2018 - POSTERS' EXTENDED ABSTRACTS, PT III, 2018, 852 : 418 - 425
  • [29] Effect of different alcohol levels on take-over performance in conditionally automated driving
    Wiedemann, Katharina
    Naujoks, Frederik
    Woerle, Johanna
    Kenntner-Mabiala, Ramona
    Kaussner, Yvonne
    Neukum, Alexandra
    ACCIDENT ANALYSIS AND PREVENTION, 2018, 115 : 89 - 97
  • [30] Effects of Visual Planning Support on Take-over Performance in Conditional Automated Driving
    Wang, Linyan
    Yu, Bo
    Zhang, Huijun
    Hu, Hongyu
    2022 IEEE 25TH INTERNATIONAL CONFERENCE ON INTELLIGENT TRANSPORTATION SYSTEMS (ITSC), 2022, : 3219 - 3224