Human-Robot Interaction Based on Gestures for Service Robots

Cited by: 1
Authors
de Sousa, Patrick [1 ]
Esteves, Tiago [2 ]
Campos, Daniel [2 ]
Duarte, Fabio [2 ]
Santos, Joana [2 ]
Leao, Joao [2 ]
Xavier, Jose [2 ]
de Matos, Luis [2 ]
Camarneiro, Manuel [2 ]
Penas, Marcelo [2 ]
Miranda, Maria [2 ]
Silva, Ricardo [2 ]
Neves, Antonio J. R. [3 ]
Teixeira, Luis [4 ]
Affiliations
[1] Univ Porto, Fac Engn, Porto, Portugal
[2] Follow Inspirat, Fundao, Portugal
[3] Univ Aveiro, IEETA DETI, Aveiro, Portugal
[4] Univ Porto, Fac Engn, INESC TEC, Porto, Portugal
Source
VIPIMAGE 2017 | 2018 / Vol. 27
Keywords
DOI
10.1007/978-3-319-68195-5_76
CLC Classification
TP18 [Theory of artificial intelligence];
Subject Classification
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Gesture recognition is very important for human-robot interfaces. In this paper, we present a novel depth-based method for gesture recognition to improve the interaction with a service robot, an autonomous shopping cart used mostly by people with reduced mobility. In the proposed solution, user identification is already implemented by the software on the robot, which extracts a bounding box focused on the user. Based on an analysis of the depth histogram, the distance from the user to the robot is calculated and the user is segmented from the background. Then, a region growing algorithm is applied to remove all other objects in the image. We apply a threshold technique to the original image again to obtain all the objects in front of the user. Intersecting the threshold-based segmentation result with the region growing result, we obtain candidate objects for the user's arms. After a labelling algorithm isolates each object, a Principal Component Analysis is applied to each one to obtain its center and orientation. Using that information, we intersect the silhouette of the arm with a line and take the uppermost intersection point, which indicates the hand position. A Kalman filter is then applied to track the hand, and gestures (Start, Stop, Pause) are recognized using state machines. We tested the proposed approach in a real-case scenario with different users and obtained an accuracy of around 89.7%.
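The arm-analysis step described in the abstract (Principal Component Analysis applied to each labelled region to obtain its center and orientation) can be sketched as below. This is a minimal illustration, not the authors' implementation: the function name and the synthetic arm mask are assumptions made for the example, and only NumPy is used.

```python
import numpy as np

def arm_center_and_orientation(mask):
    """Given a binary mask of one candidate arm region, return its
    centroid and the orientation (radians) of its principal axis,
    computed via PCA on the foreground pixel coordinates."""
    ys, xs = np.nonzero(mask)
    pts = np.stack([xs, ys], axis=1).astype(float)  # (N, 2) pixel coords
    center = pts.mean(axis=0)
    cov = np.cov((pts - center).T)                  # 2x2 covariance
    eigvals, eigvecs = np.linalg.eigh(cov)
    major = eigvecs[:, np.argmax(eigvals)]          # principal axis
    angle = np.arctan2(major[1], major[0])          # sign is ambiguous
    return center, angle

# Synthetic diagonal "arm": a 3-pixel-wide stroke along y = x.
mask = np.zeros((50, 50), dtype=bool)
for i in range(10, 40):
    mask[i, i - 1:i + 2] = True

center, angle = arm_center_and_orientation(mask)
# The recovered axis is diagonal (45 degrees modulo 180), and the
# centroid sits at the middle of the stroke.
```

With the orientation known, a line along the principal axis can be intersected with the arm silhouette, and its uppermost intersection taken as the hand position, as the abstract describes.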
Pages: 700 / 709
Page count: 10