Real Time Eye Gaze Tracking for Human Machine Interaction in the Cockpit

Cited by: 1
Authors
Turetkin, Engin [1 ]
Saeedi, Sareh [1 ]
Bigdeli, Siavash [1 ]
Stadelmann, Patrick [1 ]
Cantale, Nicolas [1 ]
Lutnyk, Luis [2 ]
Raubal, Martin [2 ]
Dunbar, L. Andrea [1 ]
Affiliations
[1] CSEM SA, Edge AI & Vis Grp, Rue Jaquet Droz 1, CH-2002 Neuchatel, Switzerland
[2] Swiss Fed Inst Technol, Inst Cartog & Geoinformat, Stefano Franscini Pl 5, CH-8093 Zurich, Switzerland
Source
Funding
EU Horizon 2020;
Keywords
Gaze-based interaction; Eye gaze detection; Aviation; Computer vision; Machine learning; Human-machine interaction;
DOI
10.1117/12.2607434
CLC Number
TP18 [Artificial intelligence theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The aeronautics industry has pioneered safety features from digital checklists to moving maps that improve pilot situational awareness and support safe ground movements. Today, pilots deal with increasingly complex cockpit environments and air traffic densification. Here we present an intelligent vision system that enables real-time human-machine interaction in the cockpit to reduce pilots' workload. The challenges for such a vision system include extreme changes in background light intensity, a large field of view, and variable working distances. Adapted hardware together with state-of-the-art computer vision techniques and machine learning algorithms for eye gaze detection allows a smooth and accurate real-time feedback system. The current system has been over-specified to explore optimized solutions for different use cases. The algorithmic pipeline for eye gaze tracking was developed and iteratively optimized to obtain the speed and accuracy required for the aviation use cases. The pipeline, a combination of data-driven and analytical approaches, runs in real time at 60 fps with a latency of about 32 ms. The eye gaze estimation error was evaluated as the point-of-regard distance error with respect to the 3D point location. An average error of less than 1.1 cm was achieved over 28 gaze points representing the cockpit instruments, placed at about 80-110 cm from the participants' eyes. The angular gaze deviation drops below 1 degree for the panels toward which accurate eye gaze was required by the use cases.
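The angular figure in the abstract follows from simple viewing geometry: a point-of-regard error on a surface at a known viewing distance subtends an angle at the eye. The sketch below (not the authors' evaluation code; the function name and the small-angle treatment are assumptions) checks that the reported ~1.1 cm average error at the stated 80-110 cm working distances is indeed consistent with a sub-degree angular deviation.

```python
import math

def angular_error_deg(por_error_cm: float, viewing_distance_cm: float) -> float:
    """Angle (degrees) subtended at the eye by a point-of-regard
    distance error lying on a surface at the given viewing distance."""
    return math.degrees(math.atan2(por_error_cm, viewing_distance_cm))

# Reported average point-of-regard error vs. the two ends of the
# stated working-distance range (80 cm and 110 cm):
for d in (80.0, 110.0):
    print(f"{d:.0f} cm: {angular_error_deg(1.1, d):.2f} deg")
# Both values come out below 1 degree, matching the abstract's claim.
```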
Pages: 10