Training Merino sheep to respond to visual and auditory cues

Cited by: 5
Authors
Taylor, Donnalee B. [1 ,3 ]
Brown, Wendy Y. [1 ]
Price, Ian R. [2 ]
Hinch, Geoff N. [1 ,3 ]
Affiliations
[1] Univ New England, Sch Environm & Rural Sci Anim Sci, Armidale, NSW 2351, Australia
[2] Univ New England, Sch Cognit Behav & Social Sci Psychol, Armidale, NSW 2351, Australia
[3] Australian Sheep Ind Cooperat Res Ctr, Armidale, NSW 2351, Australia
Source
ANIMAL PRODUCTION SCIENCE | 2010, Vol. 50, No. 5-6
Keywords
SPATIAL MEMORY;
DOI
10.1071/AN10003
Chinese Library Classification
S8 [Animal husbandry, veterinary medicine, hunting, sericulture, apiculture];
Subject Classification Code
0905;
Abstract
The feasibility of training sheep to approach a stimulus was investigated in three experimental groups and a control group of fine-wool Merino sheep (n = 11 in each group). The experimental groups were trained to approach either a visual (V), auditory (A), or visual + auditory (V+A) stimulus over eight training sessions and were subsequently tested in a T-maze for memory retention on six occasions over a 4-month period. Four testing occasions were spaced more than 30 days apart, while two were less than 3 days apart. Sheep learned to approach the cues during the training period, and the tests indicated that they retained memory of the cues for over 130 days without reinforcement. The controls received no contingency exposure and made no choice in the T-maze test. The proportions of correct stimulus choices (±s.e.m.) in the T-maze, averaged over the four longer-spaced testing occasions, were V 0.61 (0.06), A 0.50 (0.11) and V+A 0.77 (0.04). These differences approached significance (P = 0.058). Sheep trained to the V+A stimulus performed significantly (P < 0.05) better in the T-maze than sheep trained to the A stimulus alone. Comparisons over the shorter-spaced testing occasions indicated that the sheep quickly learnt not to approach the stimulus (temporary extinction) when no food reward was available. Individual temperament of the animals was not related to their learning. This study highlights the potential of V and A cues for manipulating the movement of sheep, which may be useful for farm management purposes.
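For readers who want to see how figures of this kind are produced, the minimal sketch below (Python) reproduces the style of the reported group comparison. It is an illustration under stated assumptions only: the per-sheep proportions are invented, and the one-way ANOVA and two-sample t-test are stand-ins, since the abstract does not state which statistical tests the authors used.

# Illustrative sketch only: the per-sheep values below are invented, and the
# tests (one-way ANOVA, two-sample t-test) are assumptions, not the paper's
# actual analysis, which the abstract does not specify.
import numpy as np
from scipy import stats

# hypothetical proportions of correct T-maze choices per sheep (n = 11 per group)
v_group  = np.array([0.75, 0.50, 0.75, 0.50, 0.75, 0.50, 0.50, 0.75, 0.50, 0.75, 0.50])
a_group  = np.array([0.25, 0.75, 0.50, 0.25, 0.75, 0.50, 0.25, 0.75, 0.50, 0.50, 0.50])
va_group = np.array([0.75, 0.75, 1.00, 0.75, 0.75, 0.75, 0.75, 1.00, 0.50, 0.75, 0.75])

# group means and standard errors of the mean, in the abstract's "mean (s.e.m.)" format
for name, grp in [("V", v_group), ("A", a_group), ("V+A", va_group)]:
    sem = grp.std(ddof=1) / np.sqrt(len(grp))
    print(f"{name}: mean = {grp.mean():.2f}, s.e.m. = {sem:.2f}")

# overall comparison across the three cue groups
f_stat, p_overall = stats.f_oneway(v_group, a_group, va_group)
print(f"Across groups: F = {f_stat:.2f}, P = {p_overall:.3f}")

# pairwise contrast mirroring the reported V+A versus A comparison
t_stat, p_pair = stats.ttest_ind(va_group, a_group)
print(f"V+A vs A: t = {t_stat:.2f}, P = {p_pair:.3f}")

A repeated-measures or binomial model over the individual T-maze choices would be an equally plausible analysis; the sketch only shows the mean ± s.e.m. and P-value bookkeeping behind figures of this kind.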
Pages: 541 - 545
Page count: 5
Related Papers
50 records in total
  • [1] Fractal gait training in older adults with auditory and visual cues
    Stout, Ruth
    Carder, W. P.
    Rhea, Christopher
    JOURNAL OF SPORT & EXERCISE PSYCHOLOGY, 2018, 40: S67 - S67
  • [2] Motor training: comparison of visual and auditory coded proprioceptive cues
    Jepson, Philip
    Pelah, Adar
    I-PERCEPTION, 2012, 3 (04): 249 - 249
  • [3] Video Shrinking by Auditory and Visual Cues
    Xu, Qianqian
    Liu, Huiying
    Jiang, Shuqiang
    Huang, Qingming
    Gong, Yu
    ADVANCES IN MULTIMEDIA INFORMATION PROCESSING - PCM 2009, 2009, 5879: 788 - +
  • [4] Contribution of auditory and visual cues to navigation
    van der Ham, Ineke J. M.
    Zenka, Anna
    van der Kuil, Milan N. A.
    COGNITIVE PROCESSING, 2012, 13: S12 - S12
  • [5] Auditory, visual, and tactile timing cues in visual search
    Karns, CM
    Horton, DM
    Giorgi, GM
    Knight, RT
    JOURNAL OF COGNITIVE NEUROSCIENCE, 2005: 217 - 217
  • [6] SOUND LOCALIZATION WITH CONFLICTING VISUAL AND AUDITORY CUES
    WITKIN, HA
    WAPNER, S
    LEVENTHAL, T
    JOURNAL OF EXPERIMENTAL PSYCHOLOGY, 1952, 43 (01): 58 - 67
  • [7] Comparison of Visual, Content, and Auditory Cues in Interviewing
    Giedt, F. Harold
    JOURNAL OF CONSULTING PSYCHOLOGY, 1955, 19 (06): 407 - 416
  • [8] Auditory emotional cues enhance visual perception
    Zeelenberg, Rene
    Bocanegra, Bruno R.
    COGNITION, 2010, 115 (01): 202 - 206
  • [9] Visual form cues, biological motions, auditory cues, and even olfactory cues interact to affect visual sex discriminations
    Van Der Zwan, Rick
    Brooks, Anna
    Blair, Duncan
    Machatch, Coralia
    Hacker, Graeme
    I-PERCEPTION, 2011, 2 (04): 361 - 361
  • [10] Visual-Auditory Redirection: Multimodal Integration of Incongruent Visual and Auditory Cues for Redirected Walking
    Gao, Peizhong
    Matsumoto, Keigo
    Narumi, Takuji
    Hirose, Michitaka
    2020 IEEE INTERNATIONAL SYMPOSIUM ON MIXED AND AUGMENTED REALITY (ISMAR 2020), 2020: 639 - 648