Auditory-Visual Integration of Emotional Signals in a Virtual Environment for Cynophobia

Cited by: 0
Authors
Taffou, Marine [1 ]
Chapoulie, Emmanuelle [2 ]
David, Adrien [2 ]
Guerchouche, Rachid [2 ]
Drettakis, George [2 ]
Viaud-Delmon, Isabelle [1 ]
Affiliations
[1] UPMC, CNRS, IRCAM, UMR 9912, 1 Pl Igor Stravinsky, F-75004 Paris, France
[2] INRIA, REVES, Sophia Antipolis, France
Keywords
Dog phobia; spatial audition; multisensory integration; emotion and therapy
DOI
Not available
Chinese Library Classification (CLC)
TP3 [Computing technology, computer technology]
Discipline code
0812
Abstract
Cynophobia (dog phobia) has relevant components in both the visual and the auditory modalities. To investigate the efficacy of virtual reality (VR) exposure-based treatment for cynophobia, we studied the effectiveness of auditory-visual environments in generating presence and emotion. We conducted an evaluation test with healthy participants sensitive to cynophobia to assess the capacity of auditory-visual virtual environments (VE) to generate fear reactions. Our application combines high-fidelity visual stimulation displayed in an immersive space with 3D sound. This combination enables us to present and spatially manipulate fearful stimuli in the auditory modality, the visual modality, or both. Our presentation of animated dog stimuli creates an environment that is highly arousing, suggesting that VR is a promising tool for cynophobia treatment and that manipulating auditory-visual integration might provide a way to modulate affect.
Pages: 238-242 (5 pages)
Related papers (50 in total)
  • [1] Auditory-visual integration of emotional signals in a virtual environment for cynophobia
    Taffou, M. (marine.taffou@ircam.fr), 1600, IOS Press BV (181):
  • [2] Auditory-Visual Virtual Reality as a Diagnostic and Therapeutic Tool for Cynophobia
    Suied, Clara; Drettakis, George; Warusfel, Olivier; Viaud-Delmon, Isabelle
    CYBERPSYCHOLOGY BEHAVIOR AND SOCIAL NETWORKING, 2013, 16 (02): 145-152
  • [3] The auditory-visual multisensory neurons and auditory-visual information integration in rat cortex
    Yu Li-Ping; Wang Xiao-Yan; Li Xiang-Yao; Zhang Ji-Ping; Sun Xin-De
    PROGRESS IN BIOCHEMISTRY AND BIOPHYSICS, 2006, 33 (07): 677-684
  • [4] Auditory-visual integration in fields of the auditory cortex
    Kubota, Michinori; Sugimoto, Shunji; Hosokawa, Yutaka; Ojima, Hisayuki; Horikawa, Junsei
    HEARING RESEARCH, 2017, 346: 25-33
  • [5] Auditory-Visual Virtual Reality for the Study of Multisensory Integration in Insect Navigation
    Makino, Koki; Ando, Noriyasu; Shidara, Hisashi; Hommaru, Naoto; Kanzaki, Ryohei; Ogawa, Hiroto
    BIOMIMETIC AND BIOHYBRID SYSTEMS, LIVING MACHINES 2019, 2019, 11556: 325-328
  • [6] New test of auditory-visual integration
    Gregory, AH; Gregory, HM
    PERCEPTUAL AND MOTOR SKILLS, 1973, 36 (03): 1063-1066
  • [7] Auditory-visual integration for biological motion
    Meyer, G.; Wuerger, S. M.
    PERCEPTION, 2007, 36: 171-171
  • [8] Auditory and auditory-visual integration skills as they relate to reading
    Evans, JR
    READING TEACHER, 1969, 22 (07): 625-629
  • [9] Auditory-visual integration during nonconscious perception
    Ching, April Shi Min; Kim, Jeesun; Davis, Chris
    CORTEX, 2019, 117: 1-15
  • [10] A computational model of early auditory-visual integration
    Schauer, C; Gross, HM
    PATTERN RECOGNITION, PROCEEDINGS, 2003, 2781: 362-369