Temporal relationship between auditory and visual prosodic cues

Cited: 0
Authors
Cvejic, Erin [1 ]
Kim, Jeesun [1 ]
Davis, Chris [1 ]
Affiliations
[1] Univ Western Sydney, MARCS Auditory Labs, Penrith, NSW 1797, Australia
Keywords
visual prosody; focus; phrasing; guided principal component analysis; perception; voice
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
It has been reported that non-articulatory visual cues to prosody tend to align with auditory cues, emphasizing the auditory events with which they closely coincide (the visual alignment hypothesis). We investigated the temporal relationship between visual and auditory prosodic cues in a large corpus of utterances to determine the extent to which non-articulatory visual prosodic cues align with auditory ones. Six speakers producing 30 sentences in three prosodic conditions (two repetitions each) were recorded in a dialogue exchange task, to measure how often eyebrow movements and rigid head tilts aligned with auditory prosodic cues, how such movements were distributed in time, and how they varied across prosodic conditions. The timing of brow raises and head tilts was not aligned with the auditory cues, and the occurrence of visual cues was inconsistent, lending little support to the visual alignment hypothesis. Different types of visual cues may combine with auditory cues in different ways to signal prosody.
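As a rough illustration of how such temporal alignment could be quantified (a minimal sketch, not the authors' actual analysis pipeline), the snippet below computes the proportion of visual cue onsets falling within a fixed tolerance window of the nearest auditory prosodic cue; the cue times, the window size, and the function name are hypothetical.

    # Illustrative sketch only: estimate how often visual prosodic cue onsets
    # (e.g., brow raises) fall within a tolerance window of the nearest auditory
    # prosodic cue (e.g., a pitch-accent peak). All times and the window size
    # below are hypothetical example values, in seconds from utterance onset.
    import numpy as np

    def alignment_rate(visual_onsets, auditory_cues, window=0.2):
        # Proportion of visual onsets within `window` seconds of the nearest auditory cue.
        visual_onsets = np.asarray(visual_onsets, dtype=float)
        auditory_cues = np.asarray(auditory_cues, dtype=float)
        if visual_onsets.size == 0 or auditory_cues.size == 0:
            return 0.0
        # Distance from each visual onset to its nearest auditory cue.
        lags = np.abs(visual_onsets[:, None] - auditory_cues[None, :]).min(axis=1)
        return float(np.mean(lags <= window))

    # Hypothetical single-utterance example: brow-raise onsets vs. pitch-accent times.
    print(alignment_rate([0.45, 1.30, 2.10], [0.50, 1.80]))  # -> 0.333...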
Pages: 988 - 991
Number of pages: 4
Related papers
50 items in total
  • [31] Relationship between visual and auditory discrimination and anxiety level
    Jones, O.
    JOURNAL OF GENERAL PSYCHOLOGY, 1958, 59 (01): 111 - 118
  • [32] Relationship between intraindividual auditory and visual attention in children with ADHD
    Lin, Hung-Yu
    Chang, Wen-Dien
    Hsieh, Hsieh-Chun
    Yu, Wan-Hui
    Lee, Posen
    RESEARCH IN DEVELOPMENTAL DISABILITIES, 2021, 108
  • [33] Visual form cues, biological motions, auditory cues, and even olfactory cues interact to affect visual sex discriminations
    Van Der Zwan, Rick
    Brooks, Anna
    Blair, Duncan
    Machatch, Coralia
    Hacker, Graeme
    I-PERCEPTION, 2011, 2 (04): 361 - 361
  • [34] Visual-Auditory Redirection: Multimodal Integration of Incongruent Visual and Auditory Cues for Redirected Walking
    Gao, Peizhong
    Matsumoto, Keigo
    Narumi, Takuji
    Hirose, Michitaka
    2020 IEEE INTERNATIONAL SYMPOSIUM ON MIXED AND AUGMENTED REALITY (ISMAR 2020), 2020, : 639 - 648
  • [35] The effects of visual beats on prosodic prominence: Acoustic analyses, auditory perception and visual perception
    Krahmer, Emiel
    Swerts, Marc
    JOURNAL OF MEMORY AND LANGUAGE, 2007, 57 (03) : 396 - 414
  • [36] Suboptimal Auditory Dominance in Audiovisual Integration of Temporal Cues
    Maiworm, M.
    Röder, B.
    TSINGHUA SCIENCE AND TECHNOLOGY, 2011, 16 (02): 121 - 132
  • [37] Dynamic auditory cues modulate visual motion processing
    Teramoto, W.
    Hidaka, S.
    Gyoba, J.
    Suzuki, Y.
    PERCEPTION, 2008, 37 : 72 - 72
  • [38] Conversational Engagement Recognition Using Auditory and Visual Cues
    Huang, Yuyun
    Gilmartin, Emer
    Campbell, Nick
    17TH ANNUAL CONFERENCE OF THE INTERNATIONAL SPEECH COMMUNICATION ASSOCIATION (INTERSPEECH 2016), VOLS 1-5: UNDERSTANDING SPEECH PROCESSING IN HUMANS AND MACHINES, 2016, : 590 - 594
  • [39] Surface Stickiness Perception by Auditory, Tactile, and Visual Cues
    Lee, Hyungeol
    Lee, Eunsil
    Jung, Jiye
    Kim, Junsuk
    FRONTIERS IN PSYCHOLOGY, 2019, 10
  • [40] Hemifield asymmetry in the potency of exogenous auditory and visual cues
    Sosa, Yamaya
    Clarke, Aaron M.
    McCourt, Mark E.
    VISION RESEARCH, 2011, 51 (11) : 1207 - 1215