Affect in Multimodal Information

Cited by: 9
Author
Esposito, Anna [1 ]
Institution
[1] Univ Naples 2, Dipartimento Psicol, I-81100 Caserta, Italy
Keywords
NONVERBAL-COMMUNICATION; VOCAL COMMUNICATION; FACIAL EXPRESSIONS; FEATURE-EXTRACTION; EMOTION; SPEECH; FACE; RECOGNITION; PERCEPTION; ANIMATION;
DOI
10.1007/978-1-84800-306-4_12
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In face-to-face communication, the emotional state of the speaker is transmitted to the listener through a synthetic process that involves both the verbal and the nonverbal modalities of communication. From this point of view, the transmission of the information content is redundant, because the same information is transferred through several channels. How much information about the speaker's emotional state is transmitted by each channel, and which channel plays the major role in transferring such information? The present study tries to answer these questions through a perceptual experiment that evaluates the subjective perception of emotional states through the single channels (either visual or auditory) and the combined channels (visual and auditory). Results seem to show that, taken separately, the semantic content of the message and the visual content of the message carry the same amount of information as the combined channels, suggesting that each channel performs a robust encoding of the emotional features that is very helpful in recovering the perception of the emotional state when one of the channels is degraded by noise.
Pages: 203 - 226
Page count: 24
Related papers
50 entries total
  • [21] Group Affect Prediction Using Multimodal Distributions
    Shamsi, Saqib Nizam
    Singh, Bhanu Pratap
    Wadhwa, Manya
    2018 IEEE WINTER CONFERENCE ON APPLICATIONS OF COMPUTER VISION WORKSHOPS (WACVW 2018), 2018, : 77 - 83
  • [22] Multimodal Affect Classification at Various Temporal Lengths
    Kim, Jonathan C.
    Clements, Mark A.
    IEEE TRANSACTIONS ON AFFECTIVE COMPUTING, 2015, 6 (04) : 371 - 384
  • [23] Online Affect Tracking with Multimodal Kalman Filters
    Somandepalli, Krishna
    Gupta, Rahul
    Nasir, Md
    Booth, Brandon M.
    Lee, Sungbok
    Narayanan, Shrikanth S.
    PROCEEDINGS OF THE 6TH INTERNATIONAL WORKSHOP ON AUDIO/VISUAL EMOTION CHALLENGE (AVEC'16), 2016, : 59 - 66
  • [24] Communicating expressiveness and affect in multimodal interactive systems
    Camurri, A
    Volpe, G
    De Poli, G
    Leman, M
    IEEE MULTIMEDIA, 2005, 12 (01) : 43 - 53
  • [25] Multimodal Framework for Analyzing the Affect of a Group of People
    Huang, Xiaohua
    Dhall, Abhinav
    Goecke, Roland
    Pietikainen, Matti
    Zhao, Guoying
    IEEE TRANSACTIONS ON MULTIMEDIA, 2018, 20 (10) : 2706 - 2721
  • [26] Multimodal Affect Recognition Using Boltzmann Zippers
    Lu, Kun
    Zhang, Xin
IEICE TRANSACTIONS ON INFORMATION AND SYSTEMS, 2013, E96D (11) : 2496 - 2499
  • [27] A Multimodal Database for Affect Recognition and Implicit Tagging
    Soleymani, Mohammad
    Lichtenauer, Jeroen
    Pun, Thierry
    Pantic, Maja
    IEEE TRANSACTIONS ON AFFECTIVE COMPUTING, 2012, 3 (01) : 42 - 55
  • [28] Affect Recognition for Multimodal Natural Language Processing
    Poria, Soujanya
    Soon, Ong Yew
    Liu, Bing
    Bing, Lidong
    COGNITIVE COMPUTATION, 2021, 13 (02) : 229 - 230
  • [29] Mothers' multimodal information processing is modulated by multimodal interactions with their infants
    Tanaka, Yukari
    Fukushima, Hirokata
    Okanoya, Kazuo
    Myowa-Yamakoshi, Masako
    INTERNATIONAL JOURNAL OF PSYCHOPHYSIOLOGY, 2014, 94 (02) : 174 - 174
  • [30] Affect as embodied information
    Clore, GL
    Tamir, M
    PSYCHOLOGICAL INQUIRY, 2002, 13 (01) : 37 - 45