Novel eye-based features for head pose-free gaze estimation with web camera: New model and low-cost device

Cited by: 10
Authors
Aunsri, Nattapol [1 ,2 ]
Rattarom, Suwitchaya [1 ]
Affiliations
[1] Mae Fah Luang Univ, Sch Informat Technol, Chiang Rai 57100, Thailand
[2] Mae Fah Luang Univ, Comp & Commun Engn Capac Bldg Res Ctr, Chiang Rai 57100, Thailand
关键词
Eye gaze; Human computer interaction; Feature-based gaze estimation; Feature extraction; Machine learning; Low cost camera; TRACKING TECHNIQUES; MACHINE;
DOI
10.1016/j.asej.2022.101731
Chinese Library Classification (CLC)
T [Industrial Technology]
Discipline classification code
08
Abstract
In this paper, we propose a new set of efficient features for gaze estimation that works successfully with a simple and inexpensive eye-gaze system. The features are composed of Pupil-Glint vectors, Pupil-inner-eye-Corner (canthus) vectors, Glint-inner-eye-Corner vectors, the distance vector connecting the two inner eye corners, the angles between the Pupil-Glint vectors and the Glint-inner-eye-Corner vectors, and the deviation angle. Together, these features effectively capture head movement and the positions of the eyes relative to the screen. In our experiments, an ANN with two hidden layers gave the best classification over 15 regions of interest using the proposed features, with an accuracy of 97.71%, outperforming the other techniques. The proposed features are applicable to real-time applications and fit a low-cost, simple system, so they can be widely used by disabled persons and persons with special needs in various HCI applications. (c) 2022 THE AUTHORS. Published by Elsevier BV on behalf of Faculty of Engineering, Ain Shams University. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/).
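A minimal, hypothetical sketch of how such a per-frame feature vector could be assembled from detected eye landmarks is given below. The landmark names (pupil, glint, inner corner), the angle pairing, and the interpretation of the deviation angle as the tilt of the inter-corner vector are assumptions for illustration only; the paper's exact feature definitions may differ.

import numpy as np

def angle_between(u, v):
    # Angle (radians) between two 2D vectors, guarded against zero-length inputs.
    nu, nv = np.linalg.norm(u), np.linalg.norm(v)
    if nu == 0 or nv == 0:
        return 0.0
    return float(np.arccos(np.clip(np.dot(u, v) / (nu * nv), -1.0, 1.0)))

def gaze_features(pupil_l, glint_l, corner_l, pupil_r, glint_r, corner_r):
    # Build one feature vector from left/right eye landmarks (2D pixel coordinates).
    feats = []
    for pupil, glint, corner in ((pupil_l, glint_l, corner_l),
                                 (pupil_r, glint_r, corner_r)):
        pg = glint - pupil      # Pupil-Glint vector
        pc = corner - pupil     # Pupil-inner-eye-Corner vector
        gc = corner - glint     # Glint-inner-eye-Corner vector
        feats.extend([*pg, *pc, *gc, angle_between(pg, gc)])
    cc = corner_r - corner_l    # vector connecting the two inner eye corners
    feats.extend([*cc])
    # Assumed "deviation angle": tilt of the inter-corner vector (rough head-roll proxy).
    feats.append(float(np.arctan2(cc[1], cc[0])))
    return np.asarray(feats, dtype=np.float32)

# Example with illustrative landmark coordinates from any face/eye detector:
x = gaze_features(np.array([310., 220.]), np.array([315., 224.]), np.array([340., 222.]),
                  np.array([410., 221.]), np.array([405., 225.]), np.array([380., 223.]))
print(x.shape)  # (17,) values that would feed a classifier over the 15 screen regions

The resulting vector would then be passed to a classifier such as the two-hidden-layer ANN described in the abstract; the network architecture and training details are not reproduced here.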
Pages: 12
Related papers
10 records in total
  • [1] Model-based head pose-free gaze estimation for assistive communication
    Cristina, Stefania
    Camilleri, Kenneth P.
    [J]. COMPUTER VISION AND IMAGE UNDERSTANDING, 2016, 149 : 157 - 170
  • [2] A Head Pose-free Approach for Appearance-based Gaze Estimation
    Lu, Feng
    Okabe, Takahiro
    Sugano, Yusuke
    Sato, Yoichi
    [J]. PROCEEDINGS OF THE BRITISH MACHINE VISION CONFERENCE 2011, 2011
  • [3] Gaze Estimation From Eye Appearance: A Head Pose-Free Method via Eye Image Synthesis
    Lu, Feng
    Sugano, Yusuke
    Okabe, Takahiro
    Sato, Yoichi
    [J]. IEEE TRANSACTIONS ON IMAGE PROCESSING, 2015, 24 (11) : 3680 - 3693
  • [4] Head Pose-Free Appearance-Based Gaze Sensing via Eye Image Synthesis
    Lu, Feng
    Sugano, Yusuke
    Okabe, Takahiro
    Sato, Yoichi
    [J]. 2012 21ST INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR 2012), 2012, : 1008 - 1011
  • [5] A Framework for Polynomial Model with Head Pose in Low Cost Gaze Estimation
    Rattarom, Suwitchaya
    Aunsri, Nattapol
    Uttama, Surapong
    [J]. 2017 INTERNATIONAL CONFERENCE ON DIGITAL ARTS, MEDIA AND TECHNOLOGY (ICDAMT): DIGITAL ECONOMY FOR SUSTAINABLE GROWTH, 2017, : 24 - 27
  • [6] Gaze Estimation From Color Image Based on the Eye Model With Known Head Pose
    Li, Jianfeng
    Li, Shigang
    [J]. IEEE TRANSACTIONS ON HUMAN-MACHINE SYSTEMS, 2016, 46 (03) : 414 - 423
  • [7] An optical head-pose tracking sensor for pointing devices using IR-LED based markers and a low-cost camera
    Walsh, Edwin
    Daems, Walter
    Steckel, Jan
    [J]. 2015 IEEE SENSORS, 2015, : 43 - 46
  • [8] Low-Cost and Device-Free Human Activity Recognition Based on Hierarchical Learning Model
    Chen, Jing
    Huang, Xinyu
    Jiang, Hao
    Miao, Xiren
    [J]. SENSORS, 2021, 21 (07)
  • [9] Intra-abdominal hypertension assessment based on Near Infra-Red reflectometry: New model and low-cost device
    David, Marcelo
    Halevi, Netanel
    Hirshtal, Elad
    Peretz, Aviad
    Raviv, Aviad
    Pracca, Francisco
    [J]. MEASUREMENT, 2020, 154
  • [10] Contactless temperature and distance measuring device: A low-cost, novel infrared-based 'Badge'-shaped structural model for measuring physical distance and body temperature
    Kumar, Abhijeet
    Kumar, Arpit
    [J]. AIMS Electronics and Electrical Engineering, 2022, 6 (01): : 43 - 60