Introducing CALMED: Multimodal Annotated Dataset for Emotion Detection in Children with Autism

Cited by: 1
Authors
Sousa, Annanda [1 ]
Young, Karen [1 ]
d'Aquin, Mathieu [2 ]
Zarrouk, Manel [3 ]
Holloway, Jennifer [4 ]
Affiliations
[1] Univ Galway, Galway, Ireland
[2] LORIA CNRS INRIA Univ Lorraine, K Team, Nancy, France
[3] Univ Sorbonne Paris Nord, LIPN, Villetaneuse, France
[4] ASK All Special Kids, Geneva, Switzerland
Funding
Science Foundation Ireland
Keywords
Affective Computing; Multimodal Emotion Detection; Multimodal Dataset; Autism; High-Functioning Autism; Spectrum; Recognition
DOI
10.1007/978-3-031-35681-0_43
CLC Number
TP3 (Computing technology; computer technology)
Discipline Code
0812
Abstract
Automatic Emotion Detection (ED) aims to build systems that identify users' emotions automatically. The field has the potential to enhance HCI by creating an individualised experience for the user. However, ED systems tend to perform poorly on people with Autism Spectrum Disorder (ASD), hence the need for ED systems tailored to how people with autism express emotions. Previous works have built ED systems tailored for children with ASD but did not share the resulting datasets. Sharing annotated datasets is essential to enable the development of more advanced computer models for ED within the research community. In this paper, we describe our experience establishing a process to create a multimodal annotated dataset featuring children with a level 1 diagnosis of autism. In addition, we introduce CALMED (Children, Autism, Multimodal, Emotion, Detection), the resulting multimodal emotion detection dataset featuring children with autism aged 8-12. CALMED comprises audio and video features extracted from recordings of study sessions with participants, together with annotations into four target classes provided by their parents. The dataset contains 57,012 examples in total, each representing a time window of 200 ms (0.2 s). Our experience and the methods described here, together with the shared dataset, aim to contribute to future research applications of affective computing in ASD, which has the potential to produce systems that improve the lives of people with ASD.
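The abstract's windowing scheme (each example covers a fixed 200 ms slice of a session recording, paired with a parent-annotated label from four classes) can be sketched as follows. This is an illustrative reconstruction only; the names `Window` and `make_windows` are hypothetical and not part of the CALMED release.

```python
from dataclasses import dataclass

WINDOW_MS = 200  # each example spans 200 ms (0.2 s), as described in the abstract


@dataclass
class Window:
    start_ms: int  # window start time within the session recording
    end_ms: int    # window end time
    label: int     # one of four parent-annotated target classes (0-3)


def make_windows(session_duration_s: float, labels: list[int]) -> list[Window]:
    """Split a session of the given length into consecutive 200 ms windows,
    pairing window i with the i-th per-window annotation."""
    n = int(session_duration_s * 1000) // WINDOW_MS
    return [Window(i * WINDOW_MS, (i + 1) * WINDOW_MS, labels[i]) for i in range(n)]


# A 60-second session yields 300 such examples.
windows = make_windows(60.0, [0] * 300)
print(len(windows))
```

In the actual dataset, each window additionally carries audio and video feature vectors extracted from the recording; summed over all sessions, this windowing produces the 57,012 examples reported above.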
Pages: 657-677
Page count: 21