A gaze-based interaction system for people with cerebral palsy

Cited: 12
Authors
Galante, Adriano [1 ]
Menezes, Paulo [1 ]
Affiliations
[1] Univ Coimbra, Dept Elect & Comp Engn, Inst Syst & Robot, P-3030790 Coimbra, Portugal
Keywords
Cerebral palsy; physical disabilities; eye tracker; gaze tracking; communication; human-computer interaction; board communication;
DOI
10.1016/j.protcy.2012.09.099
Chinese Library Classification (CLC)
TP [Automation & Computer Technology];
Subject Classification Code
0812;
Abstract
In this paper we present the development of an augmentative communication system that enables people with movement disabilities (mostly people with cerebral palsy) to communicate with those around them through a human-computer interaction mechanism. We developed an assistive-technology application based on gaze tracking for selecting symbols on communication boards, where each symbol represents a word or idea, so that users can easily compose phrases for their daily needs. Communication boards of this kind are already used by people with cerebral palsy; here we extend their use to people with no motor coordination at all. An important improvement of the proposed system over existing solutions is its ability to work in the presence of voluntary or involuntary head movements. (C) 2012 Published by Elsevier Ltd. Selection and/or peer review under responsibility of CENTERIS/SCIKA - Association for Promotion and Dissemination of Scientific Knowledge
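The abstract does not spell out how a gazed-at symbol is actually chosen, so the following is only a minimal Python sketch of one common approach, dwell-based selection on a gridded communication board. The `Board` and `DwellSelector` classes, the grid layout, and the 1-second dwell threshold are illustrative assumptions, not details taken from the paper.

```python
from dataclasses import dataclass

@dataclass
class Board:
    """A communication board laid out as a grid of symbol cells (assumed layout)."""
    cols: int
    rows: int
    width_px: int
    height_px: int
    symbols: list  # row-major symbol labels, len == cols * rows

    def cell_at(self, x, y):
        """Return the row-major cell index under gaze point (x, y), or None if off-board."""
        if not (0 <= x < self.width_px and 0 <= y < self.height_px):
            return None
        col = int(x * self.cols / self.width_px)
        row = int(y * self.rows / self.height_px)
        return row * self.cols + col


class DwellSelector:
    """Selects a cell once the gaze has rested on it for `dwell_s` seconds."""

    def __init__(self, board, dwell_s=1.0):
        self.board = board
        self.dwell_s = dwell_s
        self._current = None   # cell index currently fixated
        self._since = None     # timestamp when fixation on _current began

    def update(self, x, y, t):
        """Feed one gaze sample (pixels, seconds); return the selected symbol or None."""
        cell = self.board.cell_at(x, y)
        if cell != self._current:
            self._current, self._since = cell, t
            return None
        if cell is not None and t - self._since >= self.dwell_s:
            self._since = t  # re-arm so the same cell is not re-selected immediately
            return self.board.symbols[cell]
        return None


if __name__ == "__main__":
    board = Board(cols=2, rows=2, width_px=800, height_px=600,
                  symbols=["EAT", "DRINK", "YES", "NO"])
    selector = DwellSelector(board, dwell_s=1.0)

    # Simulated gaze samples at 10 Hz: 1.5 s on the top-right ("DRINK") cell.
    for i in range(15):
        choice = selector.update(600, 150, 0.1 * i)
        if choice:
            print(f"t={0.1 * i:.1f}s  selected: {choice}")
```

Re-arming the dwell timer after each selection is one simple way to mitigate the "Midas touch" problem, so that a user who keeps looking at the chosen symbol does not trigger a stream of repeated selections.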
Pages: 895-902
Page count: 8
Related Papers
50 items in total
  • [41] Towards Gaze-based Video Annotation
    Soliman, Mohamed
    R-Tavakoli, Hamed
    Laaksonen, Jorma
    2016 SIXTH INTERNATIONAL CONFERENCE ON IMAGE PROCESSING THEORY, TOOLS AND APPLICATIONS (IPTA), 2016,
  • [42] Gaze-Based Annotations for Reading Comprehension
    Cheng, Shiwei
    Sun, Zhiqiang
    Sun, Lingyun
    Yee, Kirsten
    Dey, Anind K.
CHI 2015: PROCEEDINGS OF THE 33RD ANNUAL CHI CONFERENCE ON HUMAN FACTORS IN COMPUTING SYSTEMS, 2015: 1569 - 1572
  • [43] GazeCast: Using Mobile Devices to Allow Gaze-based Interaction on Public Displays
    Namnakani, Omar
    Sinrattanavong, Penpicha
    Abdrabou, Yasmeen
    Bulling, Andreas
    Alt, Florian
    Khamis, Mohamed
    ACM SYMPOSIUM ON EYE TRACKING RESEARCH & APPLICATIONS, ETRA 2023, 2023,
  • [44] An Adaptive Model of Gaze-based Selection
    Chen, Xiuli
    Acharya, Aditya
    Oulasvirta, Antti
    Howes, Andrew
    CHI '21: PROCEEDINGS OF THE 2021 CHI CONFERENCE ON HUMAN FACTORS IN COMPUTING SYSTEMS, 2021,
  • [45] GazeEMD: Detecting Visual Intention in Gaze-Based Human-Robot Interaction
    Shi, Lei
    Copot, Cosmin
    Vanlanduit, Steve
    ROBOTICS, 2021, 10 (02)
  • [46] Gaze-based interactions in the cockpit of the future: a survey
    Rudi, David
    Kiefer, Peter
    Giannopoulos, Ioannis
    Raubal, Martin
    JOURNAL ON MULTIMODAL USER INTERFACES, 2020, 14 (01) : 25 - 48
  • [47] EyeBoard: A Fast and Accurate Eye Gaze-Based Text Entry System
    Panwar, Prateek
    Sarcar, Sayan
    Samanta, Debasis
    4TH INTERNATIONAL CONFERENCE ON INTELLIGENT HUMAN COMPUTER INTERACTION (IHCI 2012), 2012,
  • [48] Optimum object selection methods for spontaneous gaze-based interaction with linear and circular trajectories
    Nurlatifa, Hafzatin
    Hartanto, Rudy
    Ataka, Ahmad
    Wibirama, Sunu
    RESULTS IN ENGINEERING, 2024, 21
  • [49] Evaluating requirements for gaze-based interaction in a see-through head mounted display
    Graupner, Sven-Thomas
    Heubner, Michael
    Pannasch, Sebastian
    Velichkovsky, Boris M.
PROCEEDINGS OF THE EYE TRACKING RESEARCH AND APPLICATIONS SYMPOSIUM (ETRA 2008), 2008: 91 - 94
  • [50] Visual Intention Classification by Deep Learning for Gaze-based Human-Robot Interaction
    Shi, Lei
    Copot, Cosmin
    Vanlanduit, Steve
IFAC PAPERSONLINE, 2020, 53 (05): 750 - 755