Learned rather than online relative weighting of visual-proprioceptive sensory cues

Cited: 9
Authors
Mikula, Laura [1 ,2 ]
Gaveau, Valerie [1 ]
Pisella, Laure [1 ]
Khan, Aarlenne Z. [2 ]
Blohm, Gunnar [3 ]
Affiliations
[1] Lyon 1 Univ, Ctr Rech Neurosci Lyon, INSERM U1028, ImpAct Team, CNRS UMR 5292, Bron, France
[2] Univ Montreal, Sch Optometry, Montreal, PQ, Canada
[3] Queens Univ, Ctr Neurosci Studies, Botterell Hall, Rm 229, 18 Stuart St, Kingston, ON K7L 3N6, Canada
Funding
Natural Sciences and Engineering Research Council of Canada;
Keywords
multisensory integration; proprioception; reaching; vision; MULTISENSORY INTEGRATION; PSYCHOMETRIC FUNCTION; BAYESIAN INTEGRATION; PERCEPTION; ILLUSION; UNCERTAINTY; INFORMATION; MOVEMENT; BRAIN; SIGNALS;
DOI
10.1152/jn.00338.2017
Chinese Library Classification
Q189 [Neuroscience];
Discipline code
071006;
Abstract
When reaching to an object, information about both the target location and the initial hand position is required to program the motor plan for the arm. The initial hand position can be determined from proprioceptive information as well as from visual information, when available. Bayes-optimal integration posits that we use all available information, weighting each sense by its reliability, and thus generally weighting visual information more than the usually less reliable proprioceptive information. The criterion by which information is weighted has not been explicitly investigated; it has been assumed that the weights are based on task- and effector-dependent sensory reliability, which requires an explicit neuronal representation of variability. However, the weights could also be determined implicitly through learned, modality-specific integration weights rather than effector-dependent reliability. Whereas the former hypothesis predicts different proprioceptive weights for the left and right hands, e.g., because of different reliabilities of dominant vs. nondominant hand proprioception, the latter hypothesis predicts the same integration weights for both hands. We found that the proprioceptive weights for the left and right hands were highly consistent regardless of differences in sensory variability between the two hands, as measured in two separate, complementary tasks. We therefore propose that proprioceptive weights during reaching are learned across both hands, with a wide interindividual range but independent of each hand's specific proprioceptive variability.

NEW & NOTEWORTHY How visual and proprioceptive information about the hand is integrated to plan a reaching movement is still debated. The goal of this study was to clarify how the weights assigned to vision and proprioception during multisensory integration are determined. We found evidence that the integration weights are modality specific rather than based on the sensory reliabilities of the effectors.
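For reference, the reliability-based ("online") weighting scheme that the abstract contrasts with learned weights can be sketched with the standard minimum-variance cue-combination equations. This is a textbook formulation, not taken from this paper; the symbols \hat{x}, x_v, x_p, \sigma_v, \sigma_p are illustrative:

\hat{x} = w_v x_v + (1 - w_v)\, x_p, \qquad w_v = \frac{\sigma_p^2}{\sigma_v^2 + \sigma_p^2}, \qquad \sigma_{\hat{x}}^2 = \frac{\sigma_v^2 \sigma_p^2}{\sigma_v^2 + \sigma_p^2}

Under this scheme, the visual weight w_v should track each effector's measured proprioceptive variance \sigma_p^2 (e.g., differing between dominant and nondominant hands); under the learned, modality-specific alternative supported by the study, w_v would instead be a fixed per-modality weight shared across effectors.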
Pages: 1981-1992
Number of pages: 12
Related papers
14 records in total
  • [2] The hand is more easily fooled than the eye: Users are more sensitive to visual interpenetration than to visual-proprioceptive discrepancy
    Burns, E
    Razzaque, S
    Panter, AT
    Whitton, MC
    McCallus, MR
    Brooks, FP
    [J]. PRESENCE-TELEOPERATORS AND VIRTUAL ENVIRONMENTS, 2006, 15 (01) : 1 - 15
  • [3] Sensory reweighting dynamics following removal and addition of visual and proprioceptive cues
    Asslaender, Lorenz
    Peterka, Robert J.
    [J]. JOURNAL OF NEUROPHYSIOLOGY, 2016, 116 (02) : 272 - 285
  • [4] Contributions of exercise-induced fatigue versus intertrial tendon vibration on visual-proprioceptive weighting for goal-directed movement
    Manzone, Damian M.
    Tremblay, Luc
    [J]. JOURNAL OF NEUROPHYSIOLOGY, 2020, 124 (03) : 802 - 814
  • [5] FOOD DETECTION BY DEER MICE USING OLFACTORY RATHER THAN VISUAL CUES
    HOWARD, WE
    MARSH, RE
    COLE, RE
    [J]. ANIMAL BEHAVIOUR, 1968, 16 (01) : 13 - &
  • [6] Elderly use proprioception rather than visual and vestibular cues for postural motor control
    Wiesmeier, Isabella Katharina
    Dalin, Daniela
    Maurer, Christoph
    [J]. FRONTIERS IN AGING NEUROSCIENCE, 2015, 7
  • [8] Using sensory weighting to model the influence of canal, otolith and visual cues on spatial orientation and eye movements
    Zupan, LH
    Merfeld, DM
    Darlot, C
    [J]. BIOLOGICAL CYBERNETICS, 2002, 86 (03) : 209 - 230
  • [9] Differences in sensory reweighting due to loss of visual and proprioceptive cues in postural stability support among sleep-deprived cadet pilots
    Chen, Shan
    Ma, Jin
    Sun, Jicheng
    Wang, Jian
    Xiao, Xiao
    Wang, Yihan
    Hu, Wendong
    [J]. GAIT & POSTURE, 2018, 63 : 97 - 103
  • [10] Visual rather than proprioceptive information contribute more to shape-from-shading when the light-source was actively moved
    Sato, T.
    Hosokawa, K.
    [J]. PERCEPTION, 2009, 38 : 33 - 33