Introducing CALMED: Multimodal Annotated Dataset for Emotion Detection in Children with Autism

Cited by: 1
Authors
Sousa, Annanda [1 ]
Young, Karen [1 ]
d'Aquin, Mathieu [2 ]
Zarrouk, Manel [3 ]
Holloway, Jennifer [4 ]
Affiliations
[1] Univ Galway, Galway, Ireland
[2] LORIA CNRS INRIA Univ Lorraine, K Team, Nancy, France
[3] Univ Sorbonne Paris Nord, LIPN, Villetaneuse, France
[4] ASK All Special Kids, Geneva, Switzerland
Funding
Science Foundation Ireland;
Keywords
Affective Computing; Multimodal Emotion Detection; Multimodal Dataset; Autism; HIGH-FUNCTIONING AUTISM; SPECTRUM; RECOGNITION;
DOI
10.1007/978-3-031-35681-0_43
CLC Classification
TP3 [Computing Technology, Computer Technology];
Discipline Code
0812;
Abstract
Automatic Emotion Detection (ED) aims to build systems that identify users' emotions automatically. This field has the potential to enhance HCI by creating an individualised experience for the user. However, ED systems tend to perform poorly on people with Autism Spectrum Disorder (ASD), hence the need for ED systems tailored to how people with autism express emotions. Previous works have built ED systems tailored to children with ASD but did not share the resulting datasets; sharing annotated datasets is essential to enable the research community to develop more advanced computer models for ED. In this paper, we describe our experience establishing a process to create a multimodal annotated dataset featuring children with a level 1 autism diagnosis. In addition, we introduce CALMED (Children, Autism, Multimodal, Emotion, Detection), the resulting multimodal emotion detection dataset featuring children with autism aged 8-12. CALMED comprises audio and video features extracted from recordings of study sessions with participants, together with annotations provided by their parents over four target classes. The generated dataset contains a total of 57,012 examples, each representing a time window of 200 ms (0.2 s). The experience and methods described here, together with the shared dataset, aim to support future applications of affective computing in ASD, which has the potential to produce systems that improve the lives of people with ASD.
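The abstract describes examples built over 200 ms windows, each pairing audio and video features with a parent-provided label from four target classes. The sketch below illustrates how such windowed examples might be assembled from a session recording. It is a minimal illustration only: the data structures, the example class name "positive", and the assumption that feature streams are already aligned one-to-one with the 200 ms windows are ours, not the authors' published pipeline or schema.

```python
# Minimal sketch of assembling 200 ms examples from per-session feature
# streams and parent-provided annotations, in the spirit of the CALMED
# description. Field names and class labels are illustrative assumptions.

from dataclasses import dataclass
from typing import List, Optional

WINDOW_MS = 200  # each example spans a 200 ms (0.2 s) time window


@dataclass
class Annotation:
    start_ms: int      # annotated interval start
    end_ms: int        # annotated interval end
    label: str         # one of the four target classes (names assumed)


@dataclass
class Example:
    start_ms: int
    audio: List[float]    # audio features for this window
    video: List[float]    # video features for this window
    label: Optional[str]  # class label, or None if the window is unannotated


def label_for_window(start_ms: int, annotations: List[Annotation]) -> Optional[str]:
    """Return the label of the annotation covering the window midpoint, if any."""
    mid = start_ms + WINDOW_MS // 2
    for ann in annotations:
        if ann.start_ms <= mid < ann.end_ms:
            return ann.label
    return None


def build_examples(session_ms: int,
                   audio_features: List[List[float]],
                   video_features: List[List[float]],
                   annotations: List[Annotation]) -> List[Example]:
    """Slice a session into consecutive 200 ms examples.

    audio_features / video_features are assumed to be pre-aligned so that
    index i holds the features of the i-th 200 ms window.
    """
    examples = []
    for i in range(session_ms // WINDOW_MS):
        start = i * WINDOW_MS
        examples.append(Example(
            start_ms=start,
            audio=audio_features[i],
            video=video_features[i],
            label=label_for_window(start, annotations),
        ))
    return examples


if __name__ == "__main__":
    # Toy session: 1 second of data (5 windows) with one annotated interval.
    anns = [Annotation(start_ms=200, end_ms=600, label="positive")]
    audio = [[0.1, 0.2]] * 5
    video = [[0.3]] * 5
    for ex in build_examples(1000, audio, video, anns):
        print(ex.start_ms, ex.label)
```

In this toy run only the windows starting at 200 ms and 400 ms receive the "positive" label; the rest remain unannotated, which mirrors the general pattern of aligning interval annotations with fixed-length windows.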
Pages: 657-677
Number of pages: 21
Related Papers
50 records in total
  • [21] Evaluating Gaze Detection for Children with Autism Using the ChildPlay-R Dataset
    Boluk, Nursena
    Kose, Hatice
    2024 IEEE 18TH INTERNATIONAL CONFERENCE ON AUTOMATIC FACE AND GESTURE RECOGNITION, FG 2024, 2024,
  • [22] What multimodal components, tools, dataset and focus of emotion are used in the current research of multimodal emotion: a systematic literature review
    Rahmalina, Reny
    Gunawan, Wawan
    COGENT SOCIAL SCIENCES, 2024, 10 (01):
  • [23] Crawling to Improve Multimodal Emotion Detection
    Cueva, Diego R.
    Goncalves, Rafael A. M.
    Cozman, Fabio
    Pereira-Barretto, Marcos R.
    ADVANCES IN SOFT COMPUTING, PT II, 2011, 7095 : 343 - 350
  • [24] RaspberrySet: Dataset of Annotated Raspberry Images for Object Detection
    Strautina, Sarmite
    Kalnina, Ieva
    Kaufmane, Edite
    Sudars, Kaspars
    Namatevs, Ivars
    Nikulins, Arturs
    Edelmers, Edgars
    DATA, 2023, 8 (05)
  • [25] Photo Sequences of Varying Emotion: Optimization with a Valence-Arousal Annotated Dataset
    Mousas C.
    Krogmeier C.
    Wang Z.
    ACM Transactions on Interactive Intelligent Systems, 2021, 11 (02)
  • [26] A multimodal approach to emotion recognition ability in autism spectrum disorders
    Jones, Catherine R. G.
    Pickles, Andrew
    Falcaro, Milena
    Marsden, Anita J. S.
    Happe, Francesca
    Scott, Sophie K.
    Sauter, Disa
    Tregay, Jenifer
    Phillips, Rebecca J.
    Baird, Gillian
    Simonoff, Emily
    Charman, Tony
    JOURNAL OF CHILD PSYCHOLOGY AND PSYCHIATRY, 2011, 52 (03) : 275 - 285
  • [27] MultimodalGasData: Multimodal Dataset for Gas Detection and Classification
    Narkhede, Parag
    Walambe, Rahee
    Chandel, Pulkit
    Mandaokar, Shruti
    Kotecha, Ketan
    DATA, 2022, 7 (08)
  • [28] Emotion-Aware Multimodal Fusion for Meme Emotion Detection
    Sharma, Shivam
    Ramaneswaran, S.
    Akhtar, Md. Shad
    Chakraborty, Tanmoy
    IEEE TRANSACTIONS ON AFFECTIVE COMPUTING, 2024, 15 (03) : 1800 - 1811
  • [29] A Multimodal Handover Failure Detection Dataset and Baselines
    Thoduka, Santosh
    Hochgeschwender, Nico
    Gall, Juergen
    Ploeger, Paul G.
    2024 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA 2024), 2024, : 17013 - 17019
  • [30] HEADSET: Human Emotion Awareness under Partial Occlusions Multimodal DataSET
    Lohesara, Fatemeh Ghorbani
    Freitas, Davi Rabbouni
    Guillemot, Christine
    Eguiazarian, Karen
    Knorr, Sebastian
    IEEE TRANSACTIONS ON VISUALIZATION AND COMPUTER GRAPHICS, 2023, 29 (11) : 4686 - 4696