A gaze control of socially interactive robots in multiple-person interaction

Cited by: 8
Authors
Yun, Sang-Seok [1 ,2 ]
Affiliations
[1] KIST, Seoul, South Korea
[2] Hwarangno 14 Gil 5, Seoul 02792, South Korea
Keywords
Gaze control; Human-robot interaction (HRI); Non-verbal behavior; Robot attention; Multi-party; Socially interactive robots; SYSTEM;
DOI
10.1017/S0263574716000722
Chinese Library Classification (CLC): TP24 [Robotics]
Subject classification codes: 080202; 1405
Abstract
This paper proposes a computational model that enables socially interactive robots to select a suitable interlocutor when interacting with multiple persons. To support this, a hybrid approach incorporating gaze control criteria and perceptual measurements of social cues is applied to the robot. For the perception part, representative non-verbal behaviors indicating human interaction intent are designed based on psychological analyses of human-human interaction, and these behavioral features are quantitatively measured by core perceptual components covering visual, auditory, and spatial modalities. In addition, the recognition performance of each modality is improved through temporal confidence reasoning as a post-processing step. Furthermore, two factors, physical space and conversational intimacy, are incorporated into the model calculation to strengthen the social gaze control effect of the robot. Interaction experiments with performance evaluation verify that the proposed model is suitable for assessing the intended behaviors of individuals and performing gaze behavior toward multiple persons. A success rate of 93.3% against human decision-making criteria confirms the potential to establish socially acceptable gaze control in multiple-person interaction.
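To illustrate the kind of interlocutor-selection logic the abstract describes, the sketch below shows one plausible way to combine per-person non-verbal cue scores (visual, auditory, spatial), a simple temporal confidence smoothing step, and weights for physical proximity and conversational intimacy. This is an illustrative assumption, not the paper's actual model: the class names, weight values, and the averaging-based confidence step are hypothetical stand-ins for the components named in the abstract.

```python
# Minimal sketch (assumed, not the paper's implementation) of gaze-target
# selection among multiple people from fused non-verbal cue scores.
from dataclasses import dataclass, field
from collections import deque


@dataclass
class PersonTrack:
    pid: int
    distance_m: float            # measured distance from the robot
    intimacy: float              # assumed 0..1 conversational-intimacy factor
    cue_history: deque = field(default_factory=lambda: deque(maxlen=10))

    def add_cues(self, visual: float, auditory: float, spatial: float) -> None:
        """Fuse one frame of cue scores (each assumed in 0..1) and store it."""
        self.cue_history.append((visual + auditory + spatial) / 3.0)

    def confidence(self) -> float:
        """Temporal confidence reasoning, approximated here as an average over
        recent frames to suppress single-frame perception errors."""
        if not self.cue_history:
            return 0.0
        return sum(self.cue_history) / len(self.cue_history)


def select_gaze_target(people: list[PersonTrack],
                       w_space: float = 0.3,
                       w_intimacy: float = 0.2) -> int | None:
    """Return the id of the person with the highest gaze-priority score.
    The proximity and intimacy weights are illustrative assumptions."""
    best_pid, best_score = None, float("-inf")
    for p in people:
        proximity = 1.0 / (1.0 + p.distance_m)   # nearer people score higher
        score = p.confidence() + w_space * proximity + w_intimacy * p.intimacy
        if score > best_score:
            best_pid, best_score = p.pid, score
    return best_pid


# Example: the closer, more engaged person receives the robot's gaze.
a = PersonTrack(pid=1, distance_m=1.2, intimacy=0.6)
b = PersonTrack(pid=2, distance_m=2.5, intimacy=0.1)
a.add_cues(visual=0.9, auditory=0.7, spatial=0.8)
b.add_cues(visual=0.3, auditory=0.2, spatial=0.4)
print(select_gaze_target([a, b]))  # -> 1
```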
Pages: 2122-2138
Page count: 17
Related papers (50 items total)
  • [1] Fong, T.; Nourbakhsh, I. Socially interactive robots. ROBOTICS AND AUTONOMOUS SYSTEMS, 2003, 42(3-4): 139-141.
  • [2] Yang, Jiantao; Sun, Tairen. Finite-Time Interactive Control of Robots with Multiple Interaction Modes. SENSORS, 2022, 22(10).
  • [3] Fong, T.; Nourbakhsh, I.; Dautenhahn, K. A survey of socially interactive robots. ROBOTICS AND AUTONOMOUS SYSTEMS, 2003, 42(3-4): 143-166.
  • [4] Hsieh, JW; Huang, YS. Multiple-person tracking system for content analysis. INTERNATIONAL JOURNAL OF PATTERN RECOGNITION AND ARTIFICIAL INTELLIGENCE, 2002, 16(04): 447-462.
  • [5] Hsieh, JW; Huang, LW; Huang, YS. Multiple-person tracking system for content analysis. ADVANCES IN MULTIMEDIA INFORMATION PROCESSING - PCM 2001, PROCEEDINGS, 2001, 2195: 897-902.
  • [6] Mueller, Steffen; Wengefeld, Tim; Trinh, Thanh Quang; Aganian, Dustin; Eisenbach, Markus; Gross, Horst-Michael. A Multi-Modal Person Perception Framework for Socially Interactive Mobile Service Robots. SENSORS, 2020, 20(03).
  • [7] Xiong Guo-qiang; Pan Quan; Zhang Hong-cai; Ji Ning. A Satisfactory Coordination Solution of Multiple-Person Cooperative Game. 2007 INTERNATIONAL CONFERENCE ON WIRELESS COMMUNICATIONS, NETWORKING AND MOBILE COMPUTING, VOLS 1-15, 2007: 4326+.
  • [8] Obaid, Mohammad; Ahtinen, Aino; Kaipainen, Kirsikka; Ocnarescu, Ioana. Designing for Experiences with Socially Interactive Robots. NORDICHI'18: PROCEEDINGS OF THE 10TH NORDIC CONFERENCE ON HUMAN-COMPUTER INTERACTION, 2018: 948-951.
  • [9] Clabaugh, Caitlyn E. Interactive Personalization for Socially Assistive Robots. COMPANION OF THE 2017 ACM/IEEE INTERNATIONAL CONFERENCE ON HUMAN-ROBOT INTERACTION (HRI'17), 2017: 339-340.
  • [10] Hayashi, K; Hashimoto, M; Sumi, K; Sasakawa, K. Multiple-person tracker with a fixed slanting stereo camera. SIXTH IEEE INTERNATIONAL CONFERENCE ON AUTOMATIC FACE AND GESTURE RECOGNITION, PROCEEDINGS, 2004: 681-686.