A Neural Model of MST and MT Explains Perceived Object Motion during Self-Motion

Cited by: 26
Authors
Layton, Oliver W. [1 ]
Fajen, Brett R. [1 ]
Affiliations
[1] Rensselaer Polytech Inst, Dept Cognit Sci, 110 8th St, Troy, NY 12180 USA
Source
JOURNAL OF NEUROSCIENCE | 2016, Vol. 36, Issue 31
Keywords
feedback; heading; MSTd; MT; object motion; self-motion; TEMPORAL VISUAL AREA; CLASSICAL RECEPTIVE-FIELD; OPTIC FLOW STIMULI; NEURONS; SELECTIVITY; PERCEPTION; DIRECTION; MECHANISMS; RESPONSES; DYNAMICS;
DOI
10.1523/JNEUROSCI.4593-15.2016
CLC classification
Q189 [Neuroscience];
Discipline code
071006;
Abstract
When a moving object cuts in front of a moving observer at a 90° angle, the observer correctly perceives that the object is traveling along a perpendicular path, just as if viewing the moving object from a stationary vantage point. Although the observer's own (self-)motion affects the object's pattern of motion on the retina, the visual system is able to factor out the influence of self-motion and recover the world-relative motion of the object (Matsumiya and Ando, 2009). This is achieved by using information in global optic flow (Rushton and Warren, 2005; Warren and Rushton, 2009; Fajen and Matthis, 2013) and other sensory arrays (Dupin and Wexler, 2013; Fajen et al., 2013; Dokka et al., 2015) to estimate and deduct the component of the object's local retinal motion that is due to self-motion. However, this account (known as "flow parsing") is qualitative and does not shed light on mechanisms in the visual system that recover object motion during self-motion. We present a simple computational account that makes explicit possible mechanisms in visual cortex by which self-motion signals in the medial superior temporal area interact with object motion signals in the middle temporal area to transform object motion into a world-relative reference frame. The model (1) relies on two mechanisms (MST-MT feedback and disinhibition of opponent motion signals in MT) to explain existing data, (2) clarifies how pathways for self-motion and object-motion perception interact, and (3) unifies the existing flow parsing hypothesis with established neurophysiological mechanisms.
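The core flow-parsing computation described in the abstract — estimating and deducting the self-motion component from the object's local retinal motion — can be sketched algebraically. The vectors, the 2-D image coordinates, and the function names below are hypothetical simplifications for illustration only; the paper's own account is a neural model (MST-MT feedback and disinhibition), not this direct subtraction.

```python
# Illustrative sketch of "flow parsing" as vector subtraction.
# All numeric values and coordinate conventions here are hypothetical;
# they are not taken from the paper.

def retinal_motion(world_velocity, self_motion_flow):
    """An object's retinal motion = its world-relative motion (projected)
    plus the optic-flow component induced by the observer's self-motion."""
    return (world_velocity[0] + self_motion_flow[0],
            world_velocity[1] + self_motion_flow[1])

def flow_parse(retinal, estimated_self_flow):
    """Recover world-relative object motion by deducting the estimated
    self-motion component from the object's local retinal motion."""
    return (retinal[0] - estimated_self_flow[0],
            retinal[1] - estimated_self_flow[1])

# An object crossing perpendicularly (leftward in image coordinates)...
object_world = (-2.0, 0.0)
# ...while the observer's forward self-motion adds outward radial flow
# at the object's retinal location (hypothetical value):
self_flow = (1.5, 0.4)

observed = retinal_motion(object_world, self_flow)   # local retinal motion
recovered = flow_parse(observed, self_flow)          # after flow parsing
print(observed, recovered)  # → (-0.5, 0.4) (-2.0, 0.0)
```

If the self-motion estimate is accurate, subtraction exactly recovers the perpendicular world-relative path; the paper's contribution is proposing how cortical circuits could implement this step mechanistically.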
Pages: 8093-8102 (10 pages)
Related Articles
50 records in total
  • [21] A computational model for the detection of object motion by moving observer using self-motion signals
    Miura, K
    Nagano, T
    [J]. INFORMATION SCIENCES, 2000, 123 (1-2) : 55 - 73
  • [22] The impact of visually simulated self-motion on predicting object motion
    Jorges, Bjorn
    Harris, Laurence R.
    [J]. PLOS ONE, 2024, 19 (03)
  • [23] DECOMPOSITION OF RETINAL IMAGE MOTION INTO OBJECT-STRUCTURE, OBJECT-MOTION AND SELF-MOTION
    KITAZAKI, M
    SHIMOJO, S
    [J]. INVESTIGATIVE OPHTHALMOLOGY & VISUAL SCIENCE, 1994, 35 (04) : 1275 - 1275
  • [24] A computational model of motion sickness dynamics during passive self-motion in the dark
    Allred, Aaron R.
    Clark, Torin K.
    [J]. EXPERIMENTAL BRAIN RESEARCH, 2023, 241 (09) : 2311 - 2332
  • [25] A computational model of motion sickness dynamics during passive self-motion in the dark
    Allred, Aaron R.
    Clark, Torin K.
    [J]. EXPERIMENTAL BRAIN RESEARCH, 2024, 242 (05) : 1127 - 1148
  • [26] A CORTICAL SUBSTRATE FOR MOTION PERCEPTION DURING SELF-MOTION
    THIER, P
    ERICKSON, RG
    DICHGANS, J
    [J]. BEHAVIORAL AND BRAIN SCIENCES, 1994, 17 (02) : 335 - 335
  • [27] Detection thresholds for object motion and self-motion during vestibular and visuo-oculomotor stimulation
    Kolev, O
    Mergner, T
    Kimmig, H
    Becker, W
    [J]. BRAIN RESEARCH BULLETIN, 1996, 40 (5-6) : 451 - 457
  • [28] COMPUTATIONAL ASPECTS OF MOTION PERCEPTION DURING SELF-MOTION
    HADANI, I
    JULESZ, B
    [J]. BEHAVIORAL AND BRAIN SCIENCES, 1994, 17 (02) : 319 - 320
  • [29] Depth information and perceived self-motion during simulated gaze rotations
    Ehrlich, SM
    Beck, DM
    Crowell, JA
    Freeman, TCA
    Banks, MS
    [J]. VISION RESEARCH, 1998, 38 (20) : 3129 - 3145
  • [30] How Does the Brain Tell Self-Motion from Object Motion?
    Wild, Benedict
    [J]. JOURNAL OF NEUROSCIENCE, 2018, 38 (16) : 3875 - 3877