Endowing a NAO Robot With Practical Social-Touch Perception

Cited by: 7
Authors
Burns, Rachael Bevill [1 ]
Lee, Hyosang [1 ,2 ]
Seifi, Hasti [1 ,3 ]
Faulkner, Robert [1 ]
Kuchenbecker, Katherine J. [1 ]
Affiliations
[1] Max Planck Inst Intelligent Syst, Hapt Intelligence Dept, Stuttgart, Germany
[2] Univ Stuttgart, Inst Smart Sensors, Dept Elect Engn, Stuttgart, Germany
[3] Univ Copenhagen, Human Ctr Comp, Dept Comp Sci, Copenhagen, Denmark
Keywords
human-robot interaction; socially assistive robotics; social touch; affective touch; tactile sensors; gesture classification; TACTILE; RECOGNITION; CHILDREN; EMOTION; CARE;
DOI
10.3389/frobt.2022.840335
CLC Classification Number
TP24 [Robotics];
Subject Classification Codes
080202; 1405;
Abstract
Social touch is essential to everyday interactions, but current socially assistive robots have limited touch-perception capabilities. Rather than build entirely new robotic systems, we propose to augment existing rigid-bodied robots with an external touch-perception system. This practical approach can enable researchers and caregivers to continue to use robotic technology they have already purchased and learned about, but with a myriad of new social-touch interactions possible. This paper presents a low-cost, easy-to-build, soft tactile-perception system that we created for the NAO robot, as well as participants' feedback on touching this system. We installed four of our fabric-and-foam-based resistive sensors on the curved surfaces of a NAO's left arm, including its hand, lower arm, upper arm, and shoulder. Fifteen adults then performed five types of affective touch-communication gestures (hitting, poking, squeezing, stroking, and tickling) at two force intensities (gentle and energetic) on the four sensor locations; we share this dataset of four time-varying resistances, our sensor patterns, and a characterization of the sensors' physical performance. After training, a gesture-classification algorithm based on a random forest identified the correct combined touch gesture and force intensity on windows of held-out test data with an average accuracy of 74.1%, which is more than eight times better than chance. Participants rated the sensor-equipped arm as pleasant to touch and liked the robot's presence significantly more after touch interactions. Our promising results show that this type of tactile-perception system can detect necessary social-touch communication cues from users, can be tailored to a variety of robot body parts, and can provide HRI researchers with the tools needed to implement social touch in their own systems.
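The abstract describes cutting the four time-varying resistance signals into windows and classifying each window's combined gesture and force intensity with a random forest. A minimal pure-Python sketch of the windowing and feature-extraction step might look like the following; the window length, hop size, and per-channel statistics are illustrative assumptions, not details taken from the paper.

```python
# Hypothetical sketch: slide a fixed-length window over four resistance
# channels and summarize each window with simple statistics, producing
# feature vectors suitable for a classifier such as a random forest.
from statistics import mean, stdev

def window_features(channels, window_len=50, hop=25):
    """Return one feature vector per window: mean, std, min, and max
    for each channel (4 channels -> 16 features per window)."""
    n = min(len(c) for c in channels)  # truncate to the shortest channel
    features = []
    for start in range(0, n - window_len + 1, hop):
        vec = []
        for c in channels:
            w = c[start:start + window_len]
            vec.extend([mean(w), stdev(w), min(w), max(w)])
        features.append(vec)
    return features

# Example: four synthetic resistance traces, 100 samples each.
traces = [[float(i % (k + 2)) for i in range(100)] for k in range(4)]
feats = window_features(traces)
print(len(feats), len(feats[0]))  # -> 3 16
```

Each resulting 16-dimensional vector could then be passed to an off-the-shelf classifier (e.g., scikit-learn's `RandomForestClassifier`) to predict one of the ten gesture-intensity classes.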
Pages: 17