Robot Learning of Assistive Manipulation Tasks by Demonstration via Head Gesture-based Interface

Citations: 0
Authors
Kyrarini, Maria [1 ]
Zheng, Quan [1 ]
Haseeb, Muhammad Abdul [1 ]
Graeser, Axel [1 ]
Affiliations
[1] Univ Bremen, Inst Automat, Bremen, Germany
Keywords
robot learning from human demonstration; head gesture-based interface; human-robot interaction; assistive robots
DOI
10.1109/icorr.2019.8779379
Chinese Library Classification (CLC)
R318 [Biomedical Engineering]
Discipline Classification Code
0831
Abstract
Assistive robotic manipulators have the potential to support the lives of people with severe motor impairments. They can help individuals with disabilities independently perform activities of daily living, such as drinking, eating, manipulating objects, and opening doors. An attractive solution is to enable motor-impaired users to teach a robot by demonstrating daily living tasks. The user controls the robot 'manually' through an intuitive human-robot interface to provide a demonstration, after which the robot learns the performed task. However, the control of robotic manipulators by motor-impaired individuals remains a challenging problem. In this paper, a novel head gesture-based interface for hands-free robot control and a framework for robot learning from demonstration are presented. The head gesture-based interface consists of a camera mounted on the user's hat, which records the changes in the viewed scene caused by head motion. Head gesture recognition is performed using optical flow for feature extraction and a support vector machine for gesture classification. The recognized head gestures are then mapped to robot control commands to perform an object manipulation task. The robot learns the demonstrated task by generating a sequence of actions, and a Gaussian Mixture Model (GMM) is used to segment the demonstrated path of the robot's end-effector. During robotic reproduction of the task, a modified GMM and Gaussian Mixture Regression (GMR) are used to adapt to environmental changes. The proposed framework was evaluated in a real-world assistive robotic scenario in a small study with 13 participants: 12 able-bodied and one tetraplegic. The results demonstrate the potential of the proposed framework to enable severely motor-impaired individuals to demonstrate daily living tasks to robotic manipulators.
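To illustrate the GMM/GMR reproduction step described in the abstract, the following is a minimal numpy sketch of Gaussian Mixture Regression in the standard formulation: given a GMM fitted over joint (time, position) data, the end-effector position at a query time is the responsibility-weighted sum of each component's conditional expectation. The toy parameters (`priors`, `means`, `covs`) are illustrative placeholders, not values from the paper, and the paper's actual modified GMM for environmental adaptation is not reproduced here.

```python
import numpy as np

def gaussian_pdf(t, mean, var):
    """1-D Gaussian density, used to weight components by the input time t."""
    return np.exp(-0.5 * (t - mean) ** 2 / var) / np.sqrt(2.0 * np.pi * var)

def gmr_predict(t, priors, means, covs):
    """Gaussian Mixture Regression: expected position x given time t.

    priors : (K,)     mixture weights
    means  : (K, 2)   per-component means over (t, x)
    covs   : (K, 2, 2) per-component covariances over (t, x)
    """
    K = len(priors)
    # Responsibility of each component for the query time t
    h = np.array([priors[k] * gaussian_pdf(t, means[k, 0], covs[k, 0, 0])
                  for k in range(K)])
    h /= h.sum()
    # Conditional expectation of x given t within each component
    x_k = np.array([means[k, 1]
                    + covs[k, 1, 0] / covs[k, 0, 0] * (t - means[k, 0])
                    for k in range(K)])
    return float(h @ x_k)

# Toy model: two path segments encoded as two components over (t, x)
priors = np.array([0.5, 0.5])
means = np.array([[0.0, 0.0],
                  [1.0, 1.0]])
covs = np.array([np.eye(2) * 0.25,
                 np.eye(2) * 0.25])

print(gmr_predict(0.5, priors, means, covs))  # 0.5 by symmetry
```

Because the off-diagonal covariance terms feed the per-component regression slope, re-fitting the GMM on a shifted target (the "modified GMM" in the abstract) shifts the regressed path smoothly, which is what allows the reproduction to adapt to environmental changes.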
Pages: 1139-1146
Page count: 8