Pantomimic Gestures for Human-Robot Interaction

Cited by: 13
Authors
Burke, Michael [1,2]
Lasenby, Joan [1]
Affiliations
[1] Univ Cambridge, Dept Engn, Signal Proc Grp, Cambridge CB2 1PZ, England
[2] CSIR, Mobile Intelligent Autonomous Syst Modelling & Di, ZA-0001 Pretoria, South Africa
Keywords
Gesture recognition; human-robot interaction; pantomimic; principal component analysis (PCA); time series classification; time series; recognition; classification; tracking
DOI
10.1109/TRO.2015.2475956
Chinese Library Classification (CLC)
TP24 [Robotics]
Subject Classification Codes
080202; 1405
Abstract
This paper introduces a pantomimic gesture interface, which classifies human hand gestures using unmanned aerial vehicle (UAV) behavior recordings as training data. We argue that pantomimic gestures are more intuitive than iconic gestures and show that a pantomimic gesture recognition strategy using micro-UAV behavior recordings can be more robust than one trained directly using hand gestures. Hand gestures are isolated by applying a maximum information criterion, with features extracted using principal component analysis and compared using a nearest neighbor classifier. These features are biased in that they are better suited to classifying certain behaviors. We show how a Bayesian update step accounting for the geometry of training features compensates for this, resulting in fairer classification results, and introduce a weighted voting system to aid in sequence labeling.
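The abstract describes a recognition pipeline in which trajectory segments are reduced with principal component analysis (PCA), matched with a nearest-neighbour classifier, and combined into a sequence label by voting. The sketch below is only a minimal illustration of that general pattern, not the authors' implementation: the data shapes, class names, and function names are hypothetical, and the maximum-information gesture isolation step and the Bayesian correction for training-feature geometry mentioned in the abstract are omitted.

# Illustrative sketch (hypothetical data and names, not the paper's code):
# PCA features on fixed-length trajectory segments, a 1-nearest-neighbour
# classifier, and a distance-weighted vote over a sequence of segments.
import numpy as np


def pca_features(segments, n_components=3):
    """Project flattened trajectory segments onto their top principal components.

    segments: array of shape (n_segments, segment_length, n_dims)
    Returns (features, mean, components) so new segments can be projected later.
    """
    X = segments.reshape(len(segments), -1)          # flatten each segment
    mean = X.mean(axis=0)
    U, S, Vt = np.linalg.svd(X - mean, full_matrices=False)
    components = Vt[:n_components]                   # principal directions
    return (X - mean) @ components.T, mean, components


def nearest_neighbour(feature, train_features, train_labels):
    """Return the label of the closest training feature and the distance to it."""
    d = np.linalg.norm(train_features - feature, axis=1)
    i = int(np.argmin(d))
    return train_labels[i], d[i]


def weighted_vote(query_segments, mean, components, train_features, train_labels):
    """Label a gesture sequence by accumulating distance-weighted votes per class."""
    votes = {}
    for seg in query_segments:
        f = (seg.reshape(-1) - mean) @ components.T
        label, dist = nearest_neighbour(f, train_features, train_labels)
        votes[label] = votes.get(label, 0.0) + 1.0 / (dist + 1e-9)  # closer match, heavier vote
    return max(votes, key=votes.get)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Hypothetical training data: 40 segments of 30 frames of 3-D positions,
    # evenly split between two invented behaviour classes.
    train = rng.normal(size=(40, 30, 3))
    labels = np.array(["orbit"] * 20 + ["hover"] * 20)
    feats, mean, comps = pca_features(train)

    query = rng.normal(size=(5, 30, 3))              # an unlabelled gesture sequence
    print(weighted_vote(query, mean, comps, feats, labels))

Weighting each segment's vote by the inverse of its nearest-neighbour distance is one simple way to let confident matches dominate the sequence label; the paper's actual weighting and Bayesian bias compensation are more involved than this sketch.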
Pages: 1225-1237
Number of pages: 13
Related Papers
50 records in total
  • [21] Learning, Generating and Adapting Wave Gestures for Expressive Human-Robot Interaction
    Panteris, Michail
    Manschitz, Simon
    Calinon, Sylvain
    [J]. HRI'20: COMPANION OF THE 2020 ACM/IEEE INTERNATIONAL CONFERENCE ON HUMAN-ROBOT INTERACTION, 2020 : 386 - 388
  • [22] Pointing Gestures for Human-Robot Interaction in Service Robotics: A Feasibility Study
    Pozzi, Luca
    Gandolla, Marta
    Roveda, Loris
    [J]. COMPUTERS HELPING PEOPLE WITH SPECIAL NEEDS, ICCHP-AAATE 2022, PT II, 2022 : 461 - 468
  • [23] Unsupervised Simultaneous Learning of Gestures, Actions and their Associations for Human-Robot Interaction
    Mohammad, Yasser
    Nishida, Toyoaki
    Okada, Shogo
    [J]. 2009 IEEE-RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS, 2009 : 2537 - 2544
  • [24] An Underwater Human-Robot Interaction Using Hand Gestures for Fuzzy Control
    Jiang, Yu
    Peng, Xianglong
    Xue, Mingzhu
    Wang, Chong
    Qi, Hong
    [J]. INTERNATIONAL JOURNAL OF FUZZY SYSTEMS, 2021, 23 (06) : 1879 - 1889
  • [25] Unravelling the Robot Gestures Interpretation by Children with Autism Spectrum Disorder During Human-Robot Interaction
    Benedicto, Gema
    Juan, Carlos G.
    Fernandez-Caballero, Antonio
    Fernandez, Eduardo
    Ferrandez, Jose Manuel
    [J]. ARTIFICIAL INTELLIGENCE FOR NEUROSCIENCE AND EMOTIONAL SYSTEMS, PT I, IWINAC 2024, 2024, 14674 : 342 - 355
  • [26] A Method for Underwater Human-Robot Interaction Based on Gestures Tracking with Fuzzy Control
    Jiang, Yu
    Zhao, Minghao
    Wang, Chong
    Wei, Fenglin
    Qi, Hong
    [J]. INTERNATIONAL JOURNAL OF FUZZY SYSTEMS, 2021, 23 (07) : 2170 - 2181
  • [27] Effects of Eye Contact and Iconic Gestures on Message Retention in Human-Robot Interaction
    van Dijk, Elisabeth T.
    Torta, Elena
    Cuijpers, Raymond H.
    [J]. INTERNATIONAL JOURNAL OF SOCIAL ROBOTICS, 2013, 5 (04) : 491 - 501
  • [29] Human-Robot Interaction
    Jia, Yunyi
    Zhang, Biao
    Li, Miao
    King, Brady
    Meghdari, Ali
    [J]. JOURNAL OF ROBOTICS, 2018, 2018
  • [30] Human-Robot Interaction
    Sethumadhavan, Arathi
    [J]. ERGONOMICS IN DESIGN, 2012, 20 (03) : 27 - +