User-centered control of audio and visual expressive feedback by full-body movements

Cited by: 0
Authors
Castellano, Ginevra [1 ]
Bresin, Roberto [2 ]
Camurri, Antonio [1 ]
Volpe, Gualtiero [1 ]
Affiliations
[1] DIST Univ Genoa, InfoMus Lab, Viale Causa 13, I-16145 Genoa, Italy
[2] KTH, Dept Speech Music & Hearing, CSC Sch Comp Sci & Commun, Stockholm, Sweden
Keywords
affective interaction; expressive gesture; multimodal environments; interactive music systems;
DOI
None available
Chinese Library Classification
TP18 [Artificial intelligence theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In this paper we describe a system that allows users to express themselves through full-body movement and gesture and to control the generation of audio-visual feedback in real time. The system analyses the user's full-body movement and gesture in real time, extracts expressive motion features, and maps the values of these features onto real-time control of acoustic parameters for rendering a music performance. At the same time, real-time visual feedback is projected on a screen in front of the users as a coloured silhouette, whose colour depends on the emotion their movement communicates. Human movement analysis and visual feedback generation were implemented with the EyesWeb software platform, and the music performance rendering with pDM. Evaluation tests were conducted with human participants to assess the usability of the interface and the effectiveness of the design.
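The mapping described in the abstract (expressive motion features driving acoustic parameters) can be sketched as follows. This is an illustrative sketch only, not the authors' implementation: the feature names (`quantity_of_motion`, `contraction_index`), the parameter set, and all numeric ranges are assumptions chosen for the example.

```python
def clamp01(x: float) -> float:
    """Clip a value into the [0, 1] range."""
    return max(0.0, min(1.0, x))

def map_features_to_sound(quantity_of_motion: float,
                          contraction_index: float) -> dict:
    """Map normalized expressive motion features (0..1) to hypothetical
    acoustic control parameters for a music-performance renderer.

    quantity_of_motion -- overall amount of detected movement (assumed feature)
    contraction_index  -- contracted (0) vs. expanded (1) posture (assumed feature)
    """
    qom = clamp01(quantity_of_motion)
    ci = clamp01(contraction_index)
    return {
        # More movement -> faster tempo (hypothetical 60..180 BPM range).
        "tempo_bpm": 60.0 + 120.0 * qom,
        # More movement -> louder rendering (hypothetical -30..0 dB range).
        "sound_level_db": -30.0 + 30.0 * qom,
        # Expanded posture -> more detached articulation (0 = legato, 1 = staccato).
        "articulation": ci,
    }
```

In such a design the feature extraction would run per video frame, and the returned dictionary would be sent each frame as control messages to the rendering engine; the linear mappings stand in for whatever mapping functions the actual system uses.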
Pages: 501 / +
Page count: 3