GaVe: A Webcam-Based Gaze Vending Interface Using One-Point Calibration

Cited by: 4
Authors
Zeng, Zhe [1 ]
Liu, Sai [1 ]
Cheng, Hao [2 ]
Liu, Hailong [3 ]
Li, Yang [4 ]
Feng, Yu [5 ]
Siebert, Felix Wilhelm [6 ]
Affiliations
[1] Tech Univ Berlin, Berlin, Germany
[2] Univ Twente, Enschede, Netherlands
[3] Nara Inst Sci & Technol, Nara, Japan
[4] Karlsruhe Inst Technol, Karlsruhe, Germany
[5] Leibniz Univ Hannover, Hannover, Germany
[6] Tech Univ Denmark, Lyngby, Denmark
Source
JOURNAL OF EYE MOVEMENT RESEARCH | 2023, Vol. 16, No. 01
Keywords
Human-computer interaction; gaze interaction; touchless; gaze; eye movement; eye tracking; usability; dwell time; EYE-MOVEMENTS;
DOI
10.16910/jemr.16.1.2
Chinese Library Classification
R77 [Ophthalmology];
Subject Classification Code
100212;
Abstract
Gaze input, i.e., information input via the eyes of users, represents a promising method for contact-free interaction in human-machine systems. In this paper, we present the GazeVending interface (GaVe), which lets users control actions on a display with their eyes. The interface works with a regular webcam, available on most of today's laptops, and only requires a short one-point calibration before use. GaVe is designed in a hierarchical structure, presenting broad item clusters to users first and subsequently guiding them through another selection round, which allows the presentation of a large number of items. Cluster/item selection in GaVe is based on dwell time, i.e., the duration for which users look at a given cluster/item. A user study (N=22) was conducted to test optimal dwell time thresholds and comfortable human-to-display distances. Users' perception of the system, as well as error rates and task completion times, were recorded. We found that all participants were able to quickly understand how to interact with the interface and showed good performance, selecting a target item within a group of 12 items in 6.76 seconds on average. We provide design guidelines for GaVe and discuss the potential of the system.
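The abstract describes a two-stage, dwell-time-based selection scheme: gaze first selects a cluster, then an item within it, once fixation on a target exceeds a dwell-time threshold. The Python sketch below illustrates one way such a mechanism could be structured; the class name, threshold value, and gaze-stream interface are illustrative assumptions, not the authors' implementation.

# Minimal sketch of dwell-time selection, assuming a hypothetical gaze stream;
# names and values are illustrative, not taken from the paper.
import time

DWELL_THRESHOLD_S = 1.0  # assumed example value; the study evaluates several thresholds


class DwellSelector:
    """Emits a target once gaze has rested on it for longer than the threshold."""

    def __init__(self, threshold_s=DWELL_THRESHOLD_S):
        self.threshold_s = threshold_s
        self._current = None  # target currently under fixation (or None)
        self._since = None    # monotonic time at which the current fixation began

    def update(self, gazed_target, now=None):
        """Feed the element under the gaze point; returns it once dwell is reached, else None."""
        now = time.monotonic() if now is None else now
        if gazed_target != self._current:
            # Gaze moved to a new target (or off all targets): restart the dwell timer.
            self._current, self._since = gazed_target, now
            return None
        if gazed_target is not None and now - self._since >= self.threshold_s:
            self._current, self._since = None, None  # reset for the next selection round
            return gazed_target
        return None


def select_item(gaze_stream, clusters):
    """Two-stage selection: first a cluster, then an item inside the chosen cluster.

    gaze_stream yields the on-screen element under the gaze point (or None);
    clusters maps each cluster id to the collection of item ids it contains.
    """
    selector = DwellSelector()
    chosen_cluster = None
    for target in gaze_stream:
        if chosen_cluster is None:
            chosen_cluster = selector.update(target if target in clusters else None)
        else:
            item = selector.update(target if target in clusters[chosen_cluster] else None)
            if item is not None:
                return item
    return None

A caller would feed the element under the current gaze point into select_item at the webcam's frame rate; a lower threshold speeds up selection but risks more accidental activations, which is the trade-off the study's dwell-time conditions examine.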
Pages: 13
Related Papers
50 records in total
  • [1] Webcam-Based Visual Gaze Estimation
    Valenti, Roberto
    Staiano, Jacopo
    Sebe, Nicu
    Gevers, Theo
    IMAGE ANALYSIS AND PROCESSING - ICIAP 2009, PROCEEDINGS, 2009, 5716 : 662 - +
  • [2] One-point Calibration Gaze Tracking Based on Eyeball Kinematics Using Stereo Cameras
    Nagamatsu, Takashi
    Kamahara, Junzo
    Iko, Takumi
    Tanaka, Naoki
    PROCEEDINGS OF THE EYE TRACKING RESEARCH AND APPLICATIONS SYMPOSIUM (ETRA 2008), 2008, : 95 - 98
  • [3] A novel gaze estimation method with one-point calibration
    Xiong, C.-S.
    Science Press (40)
  • [4] Webcam-based gaze estimation for computer screen interaction
    Falch, Lucas
    Lohan, Katrin Solveig
    FRONTIERS IN ROBOTICS AND AI, 2024, 11
  • [5] Driver's eye-based gaze tracking system by one-point calibration
    Yoon, Hyo Sik
    Hong, Hyung Gil
    Lee, Dong Eun
    Park, Kang Ryoung
    MULTIMEDIA TOOLS AND APPLICATIONS, 2019, 78 (06) : 7155 - 7179
  • [7] Exploration of factors affecting webcam-based automated gaze coding
    Hagihara, Hiromichi
    Zaadnoordijk, Lorijn
    Cusack, Rhodri
    Kimura, Nanako
    Tsuji, Sho
    BEHAVIOR RESEARCH METHODS, 2024, 56 (07) : 7374 - 7390
  • [8] Webcam-Based Visual Gaze Estimation Under Desktop Environment
    Yu, Shujian
    Ou, Weihua
    You, Xinge
    Jiang, Xiubao
    Zhu, Yun
    Mou, Yi
    Guo, Weigang
    Tang, Yuanyan
    Chen, C. L. Philip
    NEURAL INFORMATION PROCESSING, PT II, 2015, 9490 : 457 - 466
  • [9] A One-Point Calibration Design for Hybrid Eye Typing Interface
    Zeng, Zhe
    Neuer, Elisabeth Sumithra
    Roetting, Matthias
    Siebert, Felix Wilhelm
    INTERNATIONAL JOURNAL OF HUMAN-COMPUTER INTERACTION, 2023, 39 (18) : 3620 - 3633
  • [10] MoMa: An assistive mobile manipulator with a webcam-based gaze control system
    Go, James Dominic O.
    Ong, Neal Garnett T.
    Rafanan, Carlo A.
    Tan, Brian G.
    Chu, Timothy Scott C.
    HARDWAREX, 2024, 20