Real Time Eye Gaze Tracking for Human Machine Interaction in the Cockpit

Cited by: 1
Authors
Turetkin, Engin [1 ]
Saeedi, Sareh [1 ]
Bigdeli, Siavash [1 ]
Stadelmann, Patrick [1 ]
Cantale, Nicolas [1 ]
Lutnyk, Luis [2 ]
Raubal, Martin [2 ]
Dunbar, L. Andrea [1 ]
Affiliations
[1] CSEM SA, Edge AI & Vis Grp, Rue Jaquet Droz 1, CH-2002 Neuchatel, Switzerland
[2] Swiss Fed Inst Technol, Inst Cartog & Geoinformat, Stefano Franscini Pl 5, CH-8093 Zurich, Switzerland
Funding
EU Horizon 2020;
Keywords
Gaze-based interaction; Eye gaze detection; Aviation; Computer vision; Machine learning; Human-machine interaction;
DOI
10.1117/12.2607434
Chinese Library Classification
TP18 [Artificial intelligence theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The aeronautics industry has pioneered safety features from digital checklists to moving maps that improve pilot situational awareness and support safe ground movements. Today, pilots deal with increasingly complex cockpit environments and growing air traffic density. Here we present an intelligent vision system that allows real-time human-machine interaction in the cockpit to reduce pilots' workload. The challenges for such a vision system include extreme changes in background light intensity, a large field of view, and variable working distances. Adapted hardware and the use of state-of-the-art computer vision techniques and machine learning algorithms for eye gaze detection allow a smooth and accurate real-time feedback system. The current system has been over-specified to explore optimized solutions for different use cases. The algorithmic pipeline for eye gaze tracking was developed and iteratively optimized to reach the speed and accuracy required for the aviation use cases. The pipeline, a combination of data-driven and analytical approaches, runs in real time at 60 fps with a latency of about 32 ms. The eye gaze estimation error was evaluated as the point-of-regard distance error with respect to the 3D point location. An average error of less than 1.1 cm was achieved over 28 gaze points representing the cockpit instruments, placed at about 80-110 cm from the participants' eyes. The angular gaze deviation drops below 1 degree for the panels that required accurate eye gaze tracking according to the use cases.
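The relationship between the reported point-of-regard distance error and the angular gaze deviation follows directly from the geometry of the setup. As a minimal sketch (not code from the paper), assuming the distance error is measured roughly perpendicular to the line of sight, the abstract's figures of 1.1 cm error at 80-110 cm working distance can be checked against the sub-degree claim:

```python
import math

def angular_deviation_deg(distance_error_cm: float, viewing_distance_cm: float) -> float:
    """Angular gaze error (degrees) implied by an on-panel distance error,
    assuming the error is perpendicular to the line of sight."""
    return math.degrees(math.atan2(distance_error_cm, viewing_distance_cm))

# Reported 1.1 cm average error at the stated 80-110 cm working distances:
for d in (80.0, 110.0):
    print(f"{d:.0f} cm: {angular_deviation_deg(1.1, d):.2f} deg")
# Both values come out well under 1 degree, consistent with the abstract.
```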
Pages: 10
Related Papers
50 records in total
  • [41] Persistent Human-Machine Interfaces for Robotic Arm Control Via Gaze and Eye Direction Tracking
    Ban, Seunghyeb
    Lee, Yoon Jae
    Yu, Ki Jun
    Chang, Jae Won
    Kim, Jong-Hoon
    Yeo, Woon-Hong
    [J]. ADVANCED INTELLIGENT SYSTEMS, 2023, 5 (07)
  • [42] GazeIn'13 - The 6th Workshop on Eye Gaze in Intelligent Human Machine Interaction
    Bednarik, Roman
    Huang, Hung-Hsuan
    Nakano, Yukiko
    Jokinen, Kristiina
    [J]. ICMI'13: PROCEEDINGS OF THE 2013 ACM INTERNATIONAL CONFERENCE ON MULTIMODAL INTERACTION, 2013, : 609 - 610
  • [43] CapRe: a gaze tracking system in man-machine interaction
    Collet, C
    Finkel, A
    Gherbi, R
    [J]. INES'97 : 1997 IEEE INTERNATIONAL CONFERENCE ON INTELLIGENT ENGINEERING SYSTEMS, PROCEEDINGS, 1997, : 577 - 581
  • [44] A calibrated, real-time eye gaze tracking system as an assistive system for persons with motor disability
    Sesin, A
    Adjouadi, M
    Ayala, M
    [J]. 7TH WORLD MULTICONFERENCE ON SYSTEMICS, CYBERNETICS AND INFORMATICS, VOL VI, PROCEEDINGS: INFORMATION SYSTEMS, TECHNOLOGIES AND APPLICATIONS: I, 2003, : 399 - 404
  • [45] Collaborative eye tracking based code review through real-time shared gaze visualization
    Cheng, Shiwei
    Wang, Jialing
    Shen, Xiaoquan
    Chen, Yijian
    Dey, Anind
    [J]. Frontiers of Computer Science, 2022, 16
  • [46] Subpixel eye gaze tracking
    Zhu, J
    Yang, J
    [J]. FIFTH IEEE INTERNATIONAL CONFERENCE ON AUTOMATIC FACE AND GESTURE RECOGNITION, PROCEEDINGS, 2002, : 131 - 136
  • [47] The evaluation of eye gaze using an eye tracking system in simulation training of real-time ultrasound-guided venipuncture
    Tatsuru, Kaji
    Keisuke, Yano
    Shun, Onishi
    Mayu, Matsui
    Ayaka, Nagano
    Masakazu, Murakami
    Koshiro, Sugita
    Toshio, Harumatsu
    Koji, Yamada
    Waka, Yamada
    Makoto, Matsukubo
    Mitsuru, Muto
    Kazuhiko, Nakame
    Satoshi, Ieiri
    [J]. JOURNAL OF VASCULAR ACCESS, 2022, 23 (03): : 360 - 364
  • [48] Eye Tracking and Gaze Based Interaction within Immersive Virtual Environments
    Haffegee, Adrian
    Barrow, Russell
    [J]. COMPUTATIONAL SCIENCE - ICCS 2009, 2009, 5545 : 729 - 736
  • [49] Eye Tracking for Human Robot Interaction
    Palinko, Oskar
    Rea, Francesco
    Sandini, Giulio
    Sciutti, Alessandra
    [J]. 2016 ACM SYMPOSIUM ON EYE TRACKING RESEARCH & APPLICATIONS (ETRA 2016), 2016, : 327 - 328
  • [50] Real-Time Eye-Interaction System Developed with Eye Tracking Glasses and Motion Capture
    Bao, Haifeng
    Fang, Weining
    Guo, Beiyuan
    Wang, Peng
    [J]. ADVANCES IN HUMAN FACTORS IN WEARABLE TECHNOLOGIES AND GAME DESIGN (AHFE 2017), 2018, 608 : 72 - 81