Real Time Eye Gaze Tracking for Human Machine Interaction in the Cockpit

Cited by: 1
Authors
Turetkin, Engin [1 ]
Saeedi, Sareh [1 ]
Bigdeli, Siavash [1 ]
Stadelmann, Patrick [1 ]
Cantale, Nicolas [1 ]
Lutnyk, Luis [2 ]
Raubal, Martin [2 ]
Dunbar, L. Andrea [1 ]
Affiliations
[1] CSEM SA, Edge AI & Vis Grp, Rue Jaquet Droz 1, CH-2002 Neuchatel, Switzerland
[2] Swiss Fed Inst Technol, Inst Cartog & Geoinformat, Stefano Franscini Pl 5, CH-8093 Zurich, Switzerland
Source
Funding
EU Horizon 2020
Keywords
Gaze-based interaction; Eye gaze detection; Aviation; Computer vision; Machine learning; Human-machine interaction
DOI
10.1117/12.2607434
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
The aeronautics industry has pioneered safety features, from digital checklists to moving maps that improve pilot situational awareness and support safe ground movements. Today, pilots deal with increasingly complex cockpit environments and denser air traffic. Here we present an intelligent vision system that allows real-time human-machine interaction in the cockpit to reduce pilots' workload. The challenges for such a vision system include extreme changes in background light intensity, a large field of view, and variable working distances. Adapted hardware, combined with state-of-the-art computer vision techniques and machine learning algorithms for eye gaze detection, enables a smooth and accurate real-time feedback system. The current system has been over-specified to explore optimized solutions for different use cases. The algorithmic pipeline for eye gaze tracking was developed and iteratively optimized to reach the speed and accuracy required for the aviation use cases. The pipeline, which combines data-driven and analytical approaches, runs in real time at 60 fps with a latency of about 32 ms. The eye gaze estimation error was evaluated as the point-of-regard distance error with respect to the 3D point location. An average error of less than 1.1 cm was achieved over 28 gaze points representing the cockpit instruments, placed at about 80-110 cm from the participants' eyes. The angular gaze deviation falls below 1 degree for the panels for which accurate eye gaze was required by the use cases.
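For context, the reported point-of-regard (PoR) distance error can be related to the angular gaze deviation by simple trigonometry. The sketch below is an illustration only (the helper angular_error_deg is hypothetical, not from the paper); it assumes the PoR error is measured on the panel, roughly perpendicular to the line of sight, and uses the figures quoted above (about 1.1 cm error at 80-110 cm viewing distance).

import math

def angular_error_deg(por_error_cm: float, viewing_distance_cm: float) -> float:
    # Convert a point-of-regard distance error on the panel into an angular
    # gaze error in degrees, assuming the error lies roughly perpendicular
    # to the line of sight.
    return math.degrees(math.atan2(por_error_cm, viewing_distance_cm))

# Figures reported in the abstract: ~1.1 cm average PoR error at 80-110 cm.
for distance_cm in (80.0, 110.0):
    print(f"{distance_cm:.0f} cm -> {angular_error_deg(1.1, distance_cm):.2f} deg")
# Prints roughly 0.79 deg at 80 cm and 0.57 deg at 110 cm, consistent with
# the sub-1-degree angular deviation reported above.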
Pages: 10