The uulmMAC Database: A Multimodal Affective Corpus for Affective Computing in Human-Computer Interaction

Cited by: 20
Authors
Hazer-Rau, Dilana [1 ]
Meudt, Sascha [2 ]
Daucher, Andreas [1 ]
Spohrs, Jennifer [1 ]
Hoffmann, Holger [1 ]
Schwenker, Friedhelm [2 ]
Traue, Harald C. [1 ]
Affiliations
[1] Univ Ulm, Sect Med Psychol, Frauensteige 6, D-89075 Ulm, Germany
[2] Univ Ulm, Inst Neural Informat Proc, D-89081 Ulm, Germany
Keywords
affective corpus; multimodal sensors; overload; underload; interest; frustration; cognitive load; emotion recognition; stress research; affective computing; machine learning; human-computer interaction; COGNITIVE LOAD; MENTAL WORKLOAD; EMOTION; QUESTIONNAIRE; TECHNOLOGIES;
DOI
10.3390/s20082308
Chinese Library Classification
O65 [Analytical Chemistry]
Discipline Codes
070302; 081704
Abstract
In this paper, we present a multimodal dataset for affective computing research acquired in a human-computer interaction (HCI) setting. An experimental mobile and interactive scenario was designed and implemented based on a gamified generic paradigm for the induction of dialog-based, HCI-relevant emotional and cognitive load states. It consists of six experimental sequences, inducing Interest, Overload, Normal, Easy, Underload, and Frustration. Each sequence is followed by subjective feedback to validate the induction, a respiration baseline to level off the physiological reactions, and a summary of results. Further, prior to the experiment, three questionnaires related to emotion regulation (ERQ), emotional control (TEIQue-SF), and personality traits (TIPI) were collected from each subject to evaluate the stability of the induction paradigm. Based on this HCI scenario, the University of Ulm Multimodal Affective Corpus (uulmMAC), consisting of two homogeneous samples of 60 participants and 100 recording sessions, was generated. We recorded 16 sensor modalities, comprising 4 video, 3 audio, and 7 biophysiological streams, plus depth and pose streams. Additional labels and annotations were also collected. After recording, all data were post-processed and checked for technical and signal quality, resulting in the final uulmMAC dataset of 57 subjects and 95 recording sessions. The evaluation of the reported subjective feedback shows significant differences between the sequences, consistent with the induced states, and the analysis of the questionnaires shows stable results. In summary, our uulmMAC database is a valuable contribution to the field of affective computing and multimodal data analysis: acquired in a mobile interactive scenario close to real HCI, it comprises a large number of subjects and allows transtemporal investigations.
Validated via subjective feedback and checked for quality issues, it can be used for affective computing and machine learning applications.
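The session structure described in the abstract (six induction sequences and 16 recorded streams per session) can be sketched as a small index. This is an illustrative sketch only: the `Session` class, field names, and stream grouping are assumptions for demonstration, not the published uulmMAC file format.

```python
# Illustrative index of a uulmMAC-style recording session.
# Names and layout are hypothetical; only the counts come from the abstract.
from dataclasses import dataclass, field

# The six experimentally induced states, in presentation order.
SEQUENCES = ["Interest", "Overload", "Normal", "Easy", "Underload", "Frustration"]

# 16 sensor streams in total: 4 video + 3 audio + 7 biophysiological
# + 1 depth + 1 pose.
MODALITIES = {
    "video": 4,
    "audio": 3,
    "biophysiological": 7,
    "depth": 1,
    "pose": 1,
}

@dataclass
class Session:
    """One recording session of one subject (hypothetical structure)."""
    subject_id: int
    # modality name -> list of file paths for that modality's streams
    streams: dict = field(default_factory=dict)

    def expected_stream_count(self) -> int:
        # Total number of streams a complete session should contain.
        return sum(MODALITIES.values())

    def is_complete(self) -> bool:
        # A session is complete when every modality has all its streams.
        return all(
            len(self.streams.get(name, [])) == count
            for name, count in MODALITIES.items()
        )
```

A consumer of the corpus could use `is_complete()` as a quick sanity check before feeding a session into a multimodal fusion pipeline.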
Pages: 33