Real-time eye tracking for the assessment of driver fatigue

Cited by: 57
Authors
Xu, Junli [1 ]
Min, Jianliang [1 ]
Hu, Jianfeng [1 ]
Affiliations
[1] Jiangxi Univ Technol, Ctr Collaborat & Innovat, Yao Lake Univ Pk, Nanchang 330098, Jiangxi, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
gaze tracking; sensors; fuzzy systems; computerised monitoring; driver fatigue assessment; real-time eye movement tracking device; eye-movement data collection; eye state monitoring; driving simulator; pupil area recording; fuzzy k-nearest neighbour; jackknife validation; time 1 h to 2 h;
DOI
10.1049/htl.2017.0020
Chinese Library Classification
R318 [Biomedical Engineering];
Discipline Classification Code
0831;
Abstract
Eye tracking is an important approach to collecting evidence of driver fatigue. In this contribution, the authors present a non-intrusive system for evaluating driver fatigue by tracking eye-movement behaviours. A real-time eye tracker was used to monitor participants' eye state and collect eye-movement data, which offer insight into participants' fatigue state during monotonous driving. In this study, ten healthy subjects performed 1-2 h of continuous simulated driving on a driving simulator with eye-state monitoring, and fixation time and pupil area were recorded with the eye-movement tracking device. To achieve a good cost-performance ratio and fast computation, a fuzzy k-nearest-neighbour classifier was employed to evaluate and analyse how the variations in fixation duration and pupil area differ across participants. The findings indicate significant differences in the value distribution of the pupil area between normal and fatigued driving states. The results also show that recognition accuracy under jackknife validation reaches about 89% on average, suggesting that the proposed approach has significant potential for real-time application and is capable of detecting driver fatigue.
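The classification pipeline described in the abstract, a fuzzy k-nearest-neighbour classifier scored by leave-one-out (jackknife) validation, can be sketched as follows. This is a minimal illustration on synthetic two-feature data standing in for fixation duration and pupil area; the function names, feature values, and class means are assumptions for the sketch, not the paper's data or implementation.

```python
import numpy as np

def fuzzy_knn_predict(X_train, y_train, x, k=3, m=2.0, eps=1e-9):
    """Fuzzy k-NN in the style of Keller et al.: class memberships are
    weighted by inverse distance to the k nearest neighbours, and the
    class with the highest membership is returned."""
    d = np.linalg.norm(X_train - x, axis=1)
    idx = np.argsort(d)[:k]
    # Fuzzy weighting: closer neighbours contribute more membership.
    w = 1.0 / (d[idx] ** (2.0 / (m - 1.0)) + eps)
    classes = np.unique(y_train)
    memberships = np.array([w[y_train[idx] == c].sum() for c in classes])
    return classes[np.argmax(memberships)]

def jackknife_accuracy(X, y, k=3):
    """Jackknife (leave-one-out) validation: each sample is classified
    using all the remaining samples as the training set."""
    hits = 0
    for i in range(len(X)):
        mask = np.arange(len(X)) != i
        if fuzzy_knn_predict(X[mask], y[mask], X[i], k=k) == y[i]:
            hits += 1
    return hits / len(X)

# Synthetic stand-in for [fixation duration, pupil area] feature pairs
# (class means are illustrative, not measured values from the study):
rng = np.random.default_rng(0)
normal = rng.normal([0.3, 1.0], 0.05, size=(20, 2))   # alert driving
fatigue = rng.normal([0.6, 0.7], 0.05, size=(20, 2))  # fatigued driving
X = np.vstack([normal, fatigue])
y = np.array([0] * 20 + [1] * 20)
print(f"jackknife accuracy: {jackknife_accuracy(X, y):.2f}")
```

With well-separated synthetic clusters the jackknife accuracy is near 1.0; on real eye-movement features the classes overlap more, which is consistent with the roughly 89% accuracy reported in the abstract.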
Pages: 54-58
Page count: 5
Related Papers
50 records
  • [31] REAL-TIME EYE TRACKING USING HEAT MAPS
    Krishnan, Chetana
    Jeyakumar, Vijay
    Raj, Alex Noel Jospeh
    MALAYSIAN JOURNAL OF COMPUTER SCIENCE, 2022, 35 (04) : 339 - +
  • [32] Design and implementation of a real-time eye tracking system
    Peng, Yan
    Zhou, Tian
    Wang, Shao-Peng
    Cheng, Du
    Journal of China Universities of Posts and Telecommunications, 2013, 20 (SUPPL. 1): : 1 - 5
  • [33] REAL-TIME EYE TRACKING - PREREQUISITES AND PRINCIPLES OF SOLUTION
    JEAN, B
    KAZMIERCZAK, H
    GRUNERT, T
    THIEL, HJ
    KLINISCHE MONATSBLATTER FUR AUGENHEILKUNDE, 1991, 198 (06) : 538 - 543
  • [34] Real-time facial and eye gaze tracking system
    Park, KR
    Kim, J
    IEICE TRANSACTIONS ON INFORMATION AND SYSTEMS, 2005, E88D (06): : 1231 - 1238
  • [35] PuReST: Robust Pupil Tracking for Real-Time Pervasive Eye Tracking
    Santini, Thiago
    Fuhl, Wolfgang
    Kasneci, Enkelejda
    2018 ACM SYMPOSIUM ON EYE TRACKING RESEARCH & APPLICATIONS (ETRA 2018), 2018,
  • [36] Eye localization and tracking method in driver fatigue detection
    Zhu Lei
    Zhu Shan-an
    Li Yun-han
    Proceedings of 2005 Chinese Control and Decision Conference, Vols 1 and 2, 2005, : 741 - 744
  • [37] Automatic fatigue detection of drivers through real time eye tracking
    Tayyaba, Azim
    Arfan Jaffar, M.
    Mirza, Anwar M.
    ICIC Express Letters, 2010, 4 (02): : 341 - 346
  • [38] Low Cost Real-time Eye Tracking System for Motorsports
    Xia, Yuanjie
    Lunardi, Andrew
    Heidari, Hadi
    Ghannam, Rami
    2022 29TH IEEE INTERNATIONAL CONFERENCE ON ELECTRONICS, CIRCUITS AND SYSTEMS (IEEE ICECS 2022), 2022,
  • [39] EyeLiveMetrics: Real-time Analysis of Online Reading with Eye Tracking
    Hienert, Daniel
    Schmidt, Heiko
    Kraemer, Thomas
    Kern, Dagmar
    PROCEEDINGS OF THE 2024 ACM SYMPOSIUM ON EYE TRACKING RESEARCH & APPLICATIONS, ETRA 2024, 2024,
  • [40] A Real-Time Eye Gaze Tracking Based Digital Mouse
    Kwak, SeHyun
    Lee, Daeho
    Kim, Siwon
    Park, Junghoon
    INNOVATIVE MOBILE AND INTERNET SERVICES IN UBIQUITOUS COMPUTING, IMIS 2024, 2024, 214 : 39 - 46