Affect in Multimodal Information

Cited by: 9
Author
Esposito, Anna [1 ]
Affiliation
[1] Univ Naples 2, Dipartimento Psicol, I-81100 Caserta, Italy
Keywords
NONVERBAL-COMMUNICATION; VOCAL COMMUNICATION; FACIAL EXPRESSIONS; FEATURE-EXTRACTION; EMOTION; SPEECH; FACE; RECOGNITION; PERCEPTION; ANIMATION;
DOI
10.1007/978-1-84800-306-4_12
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In face-to-face communication, the emotional state of the speaker is transmitted to the listener through a synthetic process that involves both the verbal and the nonverbal modalities of communication. From this point of view, the transmission of the information content is redundant, because the same information is conveyed through several channels simultaneously. How much information about the speaker's emotional state is transmitted by each channel, and which channel plays the major role in transferring such information? The present study tries to answer these questions through a perceptual experiment that evaluates the subjective perception of emotional states through the single channels (either visual or auditory) and the combined channels (visual and auditory). Results seem to show that, taken separately, the semantic content of the message and the visual content of the message carry the same amount of information as the combined channels, suggesting that each channel performs a robust encoding of the emotional features that is very helpful in recovering the perception of the emotional state when one of the channels is degraded by noise.
Pages: 203 - 226 (24 pages)
Related Papers
50 records in total
  • [1] Simplified classification in multimodal affect detection using vocal and facial information
    Wei, Yunyun
    Sun, Xiangran
    INFORMATION TECHNOLOGY, 2015, : 331 - 336
  • [2] Multimodal Affect and Aesthetic Experience
    Kostoulas, Theodoros
    Muszynski, Michal
    Tian, Leimin
    Roman-Rangel, Edgar
    Chaspari, Theodora
    Amelidis, Panos
    PROCEEDINGS OF THE 2022 INTERNATIONAL CONFERENCE ON MULTIMODAL INTERACTION, ICMI 2022, 2022, : 797 - 798
  • [3] A Multimodal Theory of Affect Diffusion
    Peters, Kim
    Kashima, Yoshihisa
    PSYCHOLOGICAL BULLETIN, 2015, 141 (05) : 966 - 992
  • [4] Multimodal Sensing of Affect Intensity
    Bhatia, Shalini
    ICMI'16: PROCEEDINGS OF THE 18TH ACM INTERNATIONAL CONFERENCE ON MULTIMODAL INTERACTION, 2016, : 567 - 571
  • [5] Multimodal information exploration
    Stock, O
    Strapparava, C
    Zancanaro, M
    JOURNAL OF EDUCATIONAL COMPUTING RESEARCH, 1997, 17 (03) : 277 - 295
  • [6] MULTIMODAL AFFECT DETECTION OF CAR DRIVERS
    Rothkrantz, Leon J. M.
    Datcu, Dragos
    Absil, Neil
    NEURAL NETWORK WORLD, 2009, 19 (03) : 293 - 305
  • [7] Fusion Mappings for Multimodal Affect Recognition
    Kaechele, Markus
    Schels, Martin
    Thiam, Patrick
    Schwenker, Friedhelm
    2015 IEEE SYMPOSIUM SERIES ON COMPUTATIONAL INTELLIGENCE (IEEE SSCI), 2015, : 307 - 313
  • [8] Multimodal Affect Recognition in Virtual Worlds: Avatars Mirroring Users' Affect
    Gonzalez-Sanchez, Javier
    Chavez-Echeagaray, Maria Elena
    Gibson, David
    Atkinson, Robert
    2013 HUMAINE ASSOCIATION CONFERENCE ON AFFECTIVE COMPUTING AND INTELLIGENT INTERACTION (ACII), 2013, : 724 - +
  • [9] A model for multimodal information retrieval
    Srihari, RK
    Rao, AB
    Han, B
    Munirathnam, S
    Wu, XY
    2000 IEEE INTERNATIONAL CONFERENCE ON MULTIMEDIA AND EXPO, PROCEEDINGS VOLS I-III, 2000, : 701 - 704
  • [10] TMS for multimodal information processing
    Barricelli, Barbara Rita
    Mussio, Piero
    Padula, Marco
    Scala, Paolo Luigi
    MULTIMEDIA TOOLS AND APPLICATIONS, 2011, 54 (01) : 97 - 120