Multimodal Sensing and Computational Intelligence for Situation Awareness Classification in Autonomous Driving

Cited by: 12
Authors
Yang, Jing [1 ]
Liang, Nade [1 ]
Pitts, Brandon J. [1 ]
Prakah-Asante, Kwaku O. [2 ]
Curry, Reates [2 ]
Blommer, Mike [2 ]
Swaminathan, Radhakrishnan [2 ]
Yu, Denny [1 ]
Affiliations
[1] Purdue Univ, Sch Ind Engn, W Lafayette, IN 47906 USA
[2] Ford Motor Co, Dearborn, MI 48126 USA
Keywords
Task analysis; Electroencephalography; Vehicles; Brain modeling; Particle measurements; Atmospheric measurements; Physiology; Autonomous vehicle; electroencephalogram (EEG); eye-tracking; machine learning (ML); situation awareness measurement; EEG; ATTENTION; OSCILLATIONS; PERFORMANCE; DEEP; LOOP;
DOI
10.1109/THMS.2023.3234429
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Maintaining situation awareness (SA) is essential for drivers to handle the situations that Society of Automotive Engineers (SAE) Level 3 automated vehicle systems are not designed to manage. Although advanced physiological sensors can enable continuous SA assessment, previous single-modality approaches may not be sufficient to capture SA. To address this limitation, the current study demonstrates a multimodal sensing approach for objective SA monitoring. Physiological sensor data from electroencephalography and eye-tracking were recorded for 30 participants as they performed three secondary tasks during automated driving scenarios consisting of a pre-takeover-request (pre-TOR) segment and a post-TOR segment. The tasks varied in how visual attention was allocated in the pre-TOR segment. In the post-TOR segment, drivers were expected to gather information from the driving environment in preparation for a vehicle-to-driver transition. Participants' ground-truth SA level was measured using the Situation Awareness Global Assessment Technique (SAGAT) after the post-TOR segment. A total of 23 physiological features were extracted from the post-TOR segment to train computational intelligence models. Results compared the performance of five different classifiers, the ground-truth labeling strategies, and the features included in the model. Overall, the proposed neural network model outperformed the other machine learning models and achieved the best classification accuracy (90.6%); a model with 11 features was optimal. In addition, the multimodal sensor model outperformed single-modality models in prediction performance. These results suggest that a multimodal sensing model can objectively predict SA, and they provide new insight into how physiological features contribute to SA assessment.
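The pipeline the abstract describes (extract physiological features from the post-TOR segment, label each participant with a SAGAT-derived SA class, train and evaluate a classifier) can be sketched in outline. The snippet below is a minimal illustration only: it uses synthetic stand-ins for the 23 features and a simple nearest-centroid classifier rather than the authors' neural network or their data, and the class separation is chosen arbitrarily for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for the 23 physiological features
# (e.g., EEG band powers, fixation/pupil metrics); values are illustrative only.
n_per_class, n_features = 60, 23
low_sa = rng.normal(loc=0.0, scale=1.0, size=(n_per_class, n_features))
high_sa = rng.normal(loc=1.5, scale=1.0, size=(n_per_class, n_features))

X = np.vstack([low_sa, high_sa])
y = np.array([0] * n_per_class + [1] * n_per_class)  # 0 = low SA, 1 = high SA

# Shuffle and hold out 25% of samples for testing
idx = rng.permutation(len(y))
split = int(0.75 * len(y))
train, test = idx[:split], idx[split:]

# Nearest-centroid classifier: one mean feature vector per SA class
centroids = np.array([X[train][y[train] == c].mean(axis=0) for c in (0, 1)])

def predict(samples):
    # Assign each sample to the class whose centroid is closest in feature space
    dists = np.linalg.norm(samples[:, None, :] - centroids[None, :, :], axis=2)
    return dists.argmin(axis=1)

accuracy = (predict(X[test]) == y[test]).mean()
print(f"held-out accuracy: {accuracy:.3f}")
```

Any classifier comparison of the kind the paper reports would swap the nearest-centroid step for the candidate models and repeat the evaluation under the same split.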
Pages: 270 - 281
Page count: 12
Related Papers
50 records in total
  • [21] From Distraction to Action: Elevating Situation Awareness with Visual Assistance in Level 3 Autonomous Driving
    Zhu, Yancong
    Li, Chengyu
    Qiao, Zixuan
    Qu, Rong
    Wang, Yu
    Xiong, Jiaqing
    Liu, Wei
    INTERNATIONAL JOURNAL OF HUMAN-COMPUTER INTERACTION, 2025, 41 (07) : 4271 - 4283
  • [22] A Multimodal Vision Sensor for Autonomous Driving
    Sun, Dongming
    Huang, Xiao
    Yang, Kailun
    COUNTERTERRORISM, CRIME FIGHTING, FORENSICS, AND SURVEILLANCE TECHNOLOGIES III, 2019, 11166
  • [23] Robobo SmartCity: An Autonomous Driving Model for Computational Intelligence Learning Through Educational Robotics
    Naya-Varela, Martin
    Guerreiro-Santalla, Sara
    Baamonde, Tamara
    Bellas, Francisco
    IEEE TRANSACTIONS ON LEARNING TECHNOLOGIES, 2023, 16 (04): : 543 - 559
  • [24] RSAW: A Situation Awareness System for Autonomous Robots
    Ben Ghezala, Mohamed Walid
    Bouzeghoub, Amel
    Leroux, Christophe
    2014 13TH INTERNATIONAL CONFERENCE ON CONTROL AUTOMATION ROBOTICS & VISION (ICARCV), 2014, : 450 - 455
  • [25] Situation Awareness: The Key to Safe Autonomous Systems
    Haidegger, Tamas
    IEEE JOINT 19TH INTERNATIONAL SYMPOSIUM ON COMPUTATIONAL INTELLIGENCE AND INFORMATICS AND 7TH INTERNATIONAL CONFERENCE ON RECENT ACHIEVEMENTS IN MECHATRONICS, AUTOMATION, COMPUTER SCIENCES AND ROBOTICS (CINTI-MACRO 2019), 2019, : 21 - 22
  • [26] Toward Measurement of Situation Awareness in Autonomous Vehicles
    Sirkin, David
    Martelaro, Nikolas
    Johns, Mishel
    Ju, Wendy
    PROCEEDINGS OF THE 2017 ACM SIGCHI CONFERENCE ON HUMAN FACTORS IN COMPUTING SYSTEMS (CHI'17), 2017, : 405 - 415
  • [27] Investigating impact of situation awareness-based displays of semi-autonomous driving in urgent situations
    Kim, Hwiseong
    Hong, Jeonguk
    Lee, Sangwon
    TRANSPORTATION RESEARCH PART F-TRAFFIC PSYCHOLOGY AND BEHAVIOUR, 2024, 105 : 454 - 472
  • [28] A measurement to driving situation awareness in signalized intersections
    Mao, Yan
    Wang, Wuhong
    Ding, Chenxi
    Guo, Weiwei
    Jiang, Xiaobei
    Baumann, Martin
    Wets, Geert
    TRANSPORTATION RESEARCH PART D-TRANSPORT AND ENVIRONMENT, 2018, 62 : 739 - 747
  • [29] Towards artificial situation awareness by autonomous vehicles
    McAree, Owen
    Aitken, Jonathan M.
    Veres, Sandor M.
    IFAC PAPERSONLINE, 2017, 50 (01): : 7038 - 7043
  • [30] Driving experience and situation awareness in hazard detection
    Underwood, Geoffrey
    Ngai, Athy
    Underwood, Jean
    SAFETY SCIENCE, 2013, 56 : 29 - 35