Saccadic eye movements to visual and auditory targets

Cited by: 50
Authors
Yao, LJ [1 ]
Peck, CK [1 ]
Affiliations
[1] UNIV MISSOURI, SCH OPTOMETRY, ST LOUIS, MO 63121 USA
Keywords
reaction time; saccadic latency; saccadic eye movement; ocular motor system; human
DOI
10.1007/PL00005682
Chinese Library Classification
Q189 [Neuroscience]
Discipline code
071006
Abstract
Recent neurophysiological studies of the saccadic ocular motor system have lent support to the hypothesis that this system uses a motor error signal in retinotopic coordinates to direct saccades to both visual and auditory targets. With visual targets, the coordinates of the sensory and motor error signals will be identical unless the eyes move between the time of target presentation and the time of saccade onset. However, targets from other modalities must undergo different sensory-motor transformations to access the same motor error map. Because auditory targets are initially localized in head-centered coordinates, analyzing the metrics of saccades from different starting positions allows a determination of whether the coordinates of the motor signals are those of the sensory system. We studied six human subjects who made saccades to visual or auditory targets from a central fixation point or from one at 10 degrees to the right or left of the midline of the head. Although the latencies of saccades to visual targets increased as stimulus eccentricity increased, the latencies of saccades to auditory targets decreased as stimulus eccentricity increased. The longest auditory latencies were for the smallest values of motor error (the difference between target position and fixation eye position) or desired saccade size, regardless of the position of the auditory target relative to the head or the amplitude of the executed saccade. Similarly, differences in initial eye position did not affect the accuracy of saccades of the same desired size. When saccadic error was plotted as a function of motor error, the curves obtained at the different fixation positions overlapped completely. Thus, saccadic programs in the central nervous system compensated for eye position regardless of the modality of the saccade target, supporting the hypothesis that the saccadic ocular motor system uses motor error signals to direct saccades to auditory targets.
Pages: 25-34 (10 pages)
Related papers (50 total)
  • [21] Visual and auditory cue integration for the generation of saccadic eye movements in monkeys and lever pressing in humans
    Schiller, Peter H.
    Kwak, Michelle C.
    Slocum, Warren M.
    EUROPEAN JOURNAL OF NEUROSCIENCE, 2012, 36 (04) : 2500 - 2504
  • [22] Saccadic latencies to visual targets following auditory cues
    Kean, MR
    Crawford, TJ
    SCHIZOPHRENIA RESEARCH, 2006, 81 : 109 - 110
  • [23] SACCADIC RESPONSES EVOKED BY PRESENTATION OF VISUAL AND AUDITORY TARGETS
    ZAMBARBIERI, D
    SCHMID, R
    MAGENES, G
    PRABLANC, C
    EXPERIMENTAL BRAIN RESEARCH, 1982, 47 (03) : 417 - 427
  • [24] Table tennis players use superior saccadic eye movements to track moving visual targets
    Nakazato, Riku
    Aoyama, Chisa
    Komiyama, Takaaki
    Himo, Ryoto
    Shimegi, Satoshi
    FRONTIERS IN SPORTS AND ACTIVE LIVING, 2024, 6
  • [25] The influence of geometric visual illusions on saccadic eye movements
    Tegetmeyer, H
    Wenger, A
    KLINISCHE MONATSBLATTER FUR AUGENHEILKUNDE, 2006, 223 (01) : 84 - 87
  • [26] Visual scene memory and the guidance of saccadic eye movements
    Melcher, D
    Kowler, E
    VISION RESEARCH, 2001, 41 (25-26) : 3597 - 3611
  • [27] Aging and shifts of visual attention in saccadic eye movements
    Kaneko, R
    Kuba, Y
    Sakata, Y
    Kuchinomachi, Y
    EXPERIMENTAL AGING RESEARCH, 2004, 30 (02) : 149 - 162
  • [28] VISUAL INTEGRATION ACROSS SACCADIC EYE-MOVEMENTS
    IRWIN, DE
    BROWN, JS
    BULLETIN OF THE PSYCHONOMIC SOCIETY, 1988, 26 (06) : 525 - 525
  • [29] SACCADIC EYE-MOVEMENTS AND THE PERCEPTION OF VISUAL DIRECTION
    HERSHBERGER, W
    PERCEPTION & PSYCHOPHYSICS, 1987, 41 (01) : 35 - 44
  • [30] Visual Feature Prediction Before Saccadic Eye Movements
    Herwig, Arvid
    Poth, Christian H.
    PERCEPTION, 2019, 48 : 202 - 202