Robot Learning of Assistive Manipulation Tasks by Demonstration via Head Gesture-based Interface

Citations: 0
|
Authors
Kyrarini, Maria [1 ]
Zheng, Quan [1 ]
Haseeb, Muhammad Abdul [1 ]
Graeser, Axel [1 ]
Affiliations
[1] Univ Bremen, Inst Automat, Bremen, Germany
Keywords
robot learning from human demonstration; head gesture-based interface; human-robot interaction; assistive robots;
DOI
10.1109/icorr.2019.8779379
CLC Classification Number
R318 [Biomedical Engineering];
Discipline Code
0831 ;
Abstract
Assistive robotic manipulators have the potential to support the lives of people with severe motor impairments. They can enable individuals with disabilities to independently perform daily living activities, such as drinking, eating, manipulation tasks, and opening doors. An attractive solution is to enable motor-impaired users to teach a robot by providing demonstrations of daily living tasks. The user controls the robot 'manually' with an intuitive human-robot interface to provide a demonstration, which is followed by robot learning of the performed task. However, the control of robotic manipulators by motor-impaired individuals is a challenging topic. In this paper, a novel head gesture-based interface for hands-free robot control and a framework for robot learning from demonstration are presented. The head gesture-based interface consists of a camera mounted on the user's hat, which records the changes in the viewed scene caused by head motion. Head gesture recognition is performed using optical flow for feature extraction and a support vector machine for gesture classification. The recognized head gestures are then mapped to robot control commands to perform object manipulation tasks. The robot learns the demonstrated task by generating a sequence of actions, and the Gaussian Mixture Model method is used to segment the demonstrated path of the robot's end-effector. During robotic reproduction of the task, the modified Gaussian Mixture Model and Gaussian Mixture Regression are used to adapt to environmental changes. The proposed framework was evaluated in a real-world assistive robotic scenario in a small study involving 13 participants: 12 able-bodied and one tetraplegic. The presented results demonstrate the potential of the proposed framework to enable severely motor-impaired individuals to demonstrate daily living tasks to robotic manipulators.
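The recognition stage described in the abstract (optical-flow features classified by a support vector machine) can be illustrated with a minimal sketch. This is not the authors' implementation: the gesture set, the two-dimensional mean-flow feature, and the synthetic training data are all illustrative assumptions standing in for real optical-flow measurements from the hat-mounted camera.

```python
# Hedged sketch of an optical-flow + SVM gesture classifier.
# Assumption: each head gesture produces a dominant mean flow direction
# (dx, dy) per frame window; real features would come from optical flow
# computed on the camera stream, not from synthetic samples.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Hypothetical gesture set with an assumed dominant flow direction each.
directions = {"left": (-1, 0), "right": (1, 0), "up": (0, -1), "down": (0, 1)}

# Simulate 100 feature vectors per gesture: dominant direction plus noise.
X, y = [], []
for label, (dx, dy) in directions.items():
    X.append(rng.normal([dx, dy], 0.3, size=(100, 2)))
    y += [label] * 100
X = np.vstack(X)

# Train an RBF-kernel SVM, as a stand-in for the paper's SVM classifier.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = SVC(kernel="rbf").fit(X_tr, y_tr)
accuracy = clf.score(X_te, y_te)
print(f"gesture classification accuracy: {accuracy:.2f}")
```

In a deployed interface, the predicted gesture label would then be mapped to a robot control command (e.g. end-effector motion direction), as the abstract describes.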
Pages: 1139 - 1146
Page count: 8