Pantomimic Gestures for Human-Robot Interaction

Cited by: 13
Authors
Burke, Michael [1 ,2 ]
Lasenby, Joan [1 ]
Affiliations
[1] Univ Cambridge, Dept Engn, Signal Proc Grp, Cambridge CB2 1PZ, England
[2] CSIR, Mobile Intelligent Autonomous Syst Modelling & Di, ZA-0001 Pretoria, South Africa
Keywords
Gesture recognition; human-robot interaction; pantomimic; principal component analysis (PCA); time series classification; TIME-SERIES; RECOGNITION; CLASSIFICATION; TRACKING
DOI
10.1109/TRO.2015.2475956
Chinese Library Classification
TP24 [Robotics]
Subject Classification Codes
080202; 1405
Abstract
This paper introduces a pantomimic gesture interface, which classifies human hand gestures using unmanned aerial vehicle (UAV) behavior recordings as training data. We argue that pantomimic gestures are more intuitive than iconic gestures and show that a pantomimic gesture recognition strategy using micro-UAV behavior recordings can be more robust than one trained directly using hand gestures. Hand gestures are isolated by applying a maximum information criterion, with features extracted using principal component analysis and compared using a nearest neighbor classifier. These features are biased in that they are better suited to classifying certain behaviors. We show how a Bayesian update step accounting for the geometry of training features compensates for this, resulting in fairer classification results, and introduce a weighted voting system to aid in sequence labeling.
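The abstract describes a pipeline of PCA feature extraction followed by nearest-neighbor classification of time series. As a rough illustrative sketch only (not the authors' implementation; the toy sine/cosine "behavior recordings" and all function names here are invented for demonstration), the two core steps can be written as:

```python
import numpy as np

def pca_features(X, k):
    """Project rows of X onto the top-k principal components.
    X: (n_samples, n_dims) matrix of flattened trajectories."""
    mean = X.mean(axis=0)
    # SVD of the centered data; rows of Vt are the principal directions
    _, _, Vt = np.linalg.svd(X - mean, full_matrices=False)
    components = Vt[:k]
    return (X - mean) @ components.T, mean, components

def nn_classify(train_feats, train_labels, query_feat):
    """Label a query by its nearest training feature (Euclidean)."""
    dists = np.linalg.norm(train_feats - query_feat, axis=1)
    return train_labels[int(np.argmin(dists))]

rng = np.random.default_rng(0)
# Toy stand-in for behavior recordings: two classes of noisy 1-D trajectories
t = np.linspace(0.0, 1.0, 50)
cls_a = np.sin(2 * np.pi * t) + 0.1 * rng.standard_normal((20, 50))
cls_b = np.cos(2 * np.pi * t) + 0.1 * rng.standard_normal((20, 50))
X = np.vstack([cls_a, cls_b])
y = np.array([0] * 20 + [1] * 20)

feats, mean, comps = pca_features(X, k=3)

# Classify a new sine-like trajectory using the same projection
query = np.sin(2 * np.pi * t) + 0.1 * rng.standard_normal(50)
q_feat = (query - mean) @ comps.T
label = nn_classify(feats, y, q_feat)
```

The paper's contribution goes beyond this baseline: the bias correction via a Bayesian update over the feature geometry, and the weighted voting for sequence labeling, are not reflected in this sketch.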
Pages: 1225-1237
Page count: 13
Related Papers
50 in total
  • [1] Conversational Gestures in Human-Robot Interaction
    Bremner, Paul
    Pipe, Anthony
    Melhuish, Chris
    Fraser, Mike
    Subramanian, Sriram
    [J]. 2009 IEEE INTERNATIONAL CONFERENCE ON SYSTEMS, MAN AND CYBERNETICS (SMC 2009), VOLS 1-9, 2009, : 1645 - +
  • [2] Human-Robot Interaction Using Pointing Gestures
    Tolgyessy, Michal
    Dekan, Martin
    Hubinsky, Peter
    [J]. ISCSIC'18: PROCEEDINGS OF THE 2ND INTERNATIONAL SYMPOSIUM ON COMPUTER SCIENCE AND INTELLIGENT CONTROL, 2018,
  • [3] Integration of Gestures and Speech in Human-Robot Interaction
    Meena, Raveesh
    Jokinen, Kristiina
    Wilcock, Graham
    [J]. 3RD IEEE INTERNATIONAL CONFERENCE ON COGNITIVE INFOCOMMUNICATIONS (COGINFOCOM 2012), 2012, : 673 - 678
  • [4] Incremental learning of gestures for human-robot interaction
    Okada, Shogo
    Kobayashi, Yoichi
    Ishibashi, Satoshi
    Nishida, Toyoaki
    [J]. AI & SOCIETY, 2010, 25 (02) : 155 - 168
  • [5] Pointing Gestures for Human-Robot Interaction with the Humanoid Robot Digit
    Lorentz, Viktor
    Weiss, Manuel
    Hildebrand, Kristian
    Boblan, Ivo
    [J]. 2023 32ND IEEE INTERNATIONAL CONFERENCE ON ROBOT AND HUMAN INTERACTIVE COMMUNICATION, RO-MAN, 2023, : 1886 - 1892
  • [6] Visual recognition of pointing gestures for human-robot interaction
    Nickel, Kai
    Stiefelhagen, Rainer
    [J]. IMAGE AND VISION COMPUTING, 2007, 25 (12) : 1875 - 1884
  • [7] Recognizing Touch Gestures for Social Human-Robot Interaction
    Altuglu, Tugce Balli
    Altun, Kerem
    [J]. ICMI'15: PROCEEDINGS OF THE 2015 ACM INTERNATIONAL CONFERENCE ON MULTIMODAL INTERACTION, 2015, : 407 - 413
  • [8] Understanding and learning of gestures through human-robot interaction
    Kuno, Y
    Murashima, T
    Shimada, N
    Shirai, Y
    [J]. 2000 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS 2000), VOLS 1-3, PROCEEDINGS, 2000, : 2133 - 2138
  • [9] Human-Robot Interaction Based on Gestures for Service Robots
    de Sousa, Patrick
    Esteves, Tiago
    Campos, Daniel
    Duarte, Fabio
    Santos, Joana
    Leao, Joao
    Xavier, Jose
    de Matos, Luis
    Camarneiro, Manuel
    Penas, Marcelo
    Miranda, Maria
    Silva, Ricardo
    Neves, Antonio J. R.
    Teixeira, Luis
    [J]. VIPIMAGE 2017, 2018, 27 : 700 - 709
  • [10] Human-Robot Interaction by Understanding Upper Body Gestures
    Xiao, Yang
    Zhang, Zhijun
    Beck, Aryel
    Yuan, Junsong
    Thalmann, Daniel
    [J]. PRESENCE-VIRTUAL AND AUGMENTED REALITY, 2014, 23 (02): : 133 - 154