Perceived touch location is coded using a gaze signal

Cited by: 0
Authors
Lisa M. Pritchett
Laurence R. Harris
Affiliation
[1] York University, Department of Psychology & Centre for Vision Research
Source
Experimental Brain Research
Keywords
Tactile coding; Eye position; Head position; Gaze; Visual representation of body; Coordinate transformations;
DOI
Not available
Abstract
The location of a touch to the skin, first coded in body coordinates, may be transformed into retinotopic coordinates to facilitate visual-tactile integration. In order for the touch location to be transformed into a retinotopic reference frame, the positions of the eyes and head must be taken into account. Previous studies have found eye position–related errors (Harrar and Harris in Exp Brain Res 203:615–620, 2009) and head position–related errors (Ho and Spence in Brain Res 1144:136–141, 2007) in tactile localization, indicating that imperfect versions of eye and head signals may be used in the body-to-visual coordinate transformation. Here, we investigated the combined effects of head and eye position on the perceived location of a mechanical touch to the arm. Subjects reported the perceived position of a touch that was presented while their head was positioned to the left, right, or center of the body and their eyes were positioned to the left, right, or center in their orbits. The perceived location of a touch shifted in the direction of both the head and the eyes by approximately the same amount. We interpret these shifts as being consistent with touch location being coded in a visual reference frame, with a gaze signal used to compute the transformation.
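The abstract's account — touch first coded in body coordinates, then shifted by a gaze signal that sums head-on-body and eye-in-orbit positions — can be sketched numerically. Everything below (function names, the gain value, the 15° deviations) is an illustrative assumption for exposition, not a parameter or model reported in the paper:

```python
# Sketch of the body-to-retinotopic account described in the abstract.
# Gain and angles are made-up illustrative values, not the paper's data.

def gaze_angle(eye_deg: float, head_deg: float) -> float:
    """Gaze direction in body coordinates = head-on-body + eye-in-orbit."""
    return head_deg + eye_deg

def perceived_touch(body_loc_deg: float, eye_deg: float, head_deg: float,
                    gain: float = 0.1) -> float:
    """Predicted perceived touch location under a gaze-based transform.

    If the transform into retinotopic coordinates uses a slightly
    miscalibrated gaze signal, the perceived location shifts in
    proportion to gaze: perceived = actual + gain * gaze.
    """
    return body_loc_deg + gain * gaze_angle(eye_deg, head_deg)

# Head and eyes each deviated 15 deg rightward: the predicted shift is
# the sum of the head-alone and eye-alone shifts, because the model
# depends only on their sum (gaze).
baseline = perceived_touch(0.0, 0.0, 0.0)
shift_both = perceived_touch(0.0, 15.0, 15.0) - baseline
shift_eye = perceived_touch(0.0, 15.0, 0.0) - baseline
shift_head = perceived_touch(0.0, 0.0, 15.0) - baseline
```

On this additive account, equal eye and head deviations produce equal shifts — matching the abstract's finding that perceived touch location moved with the head and the eyes "by approximately the same amount".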
Pages: 229 - 234
Number of pages: 5
Related papers (50 in total)
  • [1] Perceived touch location is coded using a gaze signal
    Pritchett, Lisa M.
    Harris, Laurence R.
    [J]. EXPERIMENTAL BRAIN RESEARCH, 2011, 213 (2-3) : 229 - 234
  • [2] Eye position affects the perceived location of touch
    Harrar, Vanessa
    Harris, Laurence R.
    [J]. EXPERIMENTAL BRAIN RESEARCH, 2009, 198 (2-3) : 403 - 410
  • [3] Gaze'N'Touch: Enhancing Text Selection on Mobile Devices Using Gaze
    Rivu, Radiah
    Hassib, Mariam
    Abdrabou, Yasmeen
    Alt, Florian
    Pfeuffer, Ken
    [J]. CHI'20: EXTENDED ABSTRACTS OF THE 2020 CHI CONFERENCE ON HUMAN FACTORS IN COMPUTING SYSTEMS, 2020
  • [4] Gaze plus touch vs. Touch: What's the Trade-off When Using Gaze to Extend Touch to Remote Displays?
    Pfeuffer, Ken
    Alexander, Jason
    Gellersen, Hans
    [J]. HUMAN-COMPUTER INTERACTION - INTERACT 2015, PT II, 2015, 9297 : 349 - 367
  • [5] Gaze and Touch Interaction on Tablets
    Pfeuffer, Ken
    Gellersen, Hans
    [J]. UIST 2016: PROCEEDINGS OF THE 29TH ANNUAL SYMPOSIUM ON USER INTERFACE SOFTWARE AND TECHNOLOGY, 2016 : 301 - 311
  • [6] Searching for a perceived gaze direction using eye tracking
    Palanica, Adam
    Itier, Roxane J.
    [J]. JOURNAL OF VISION, 2011, 11 (02)
  • [7] GazeLockPatterns: Comparing Authentication Using Gaze and Touch for Entering Lock Patterns
    Abdrabou, Yasmeen
    Pfeuffer, Ken
    Khamis, Mohamed
    Alt, Florian
    [J]. ETRA 2020 SHORT PAPERS: ACM SYMPOSIUM ON EYE TRACKING RESEARCH & APPLICATIONS, 2020
  • [8] Gaze cueing as a function of perceived gaze direction
    Qian, Qian
    Song, Miao
    Shinomori, Keizo
    [J]. JAPANESE PSYCHOLOGICAL RESEARCH, 2013, 55 (03) : 264 - 272
  • [9] Touch Input and Gaze Correlation on Tablets
    Weill-Tessier, Pierre
    Gellersen, Hans
    [J]. INTELLIGENT DECISION TECHNOLOGIES 2017, KES-IDT 2017, PT II, 2018, 73 : 287 - 296