Inferring Guidance Information in Cooperative Human-Robot Tasks

Times Cited: 0
Authors
Berger, Erik [1 ]
Vogt, David [1 ]
Haji-Ghassemi, Nooshin [2 ]
Jung, Bernhard [1 ]
Ben Amor, Heni [2 ]
Affiliations
[1] Tech Univ Bergakad Freiberg, Inst Comp Sci, Bernhard Von Cotta Str 2, D-09599 Freiberg, Germany
[2] Tech Univ Darmstadt, Dept Comp Sci, Hochschulstr 10, D-64289 Darmstadt, Germany
Keywords
DOI: Not available
Chinese Library Classification: TP24 (Robotics)
Subject Classification Codes: 080202; 1405
Abstract
In many cooperative tasks between a human and a robotic assistant, the human guides the robot by exerting forces, either through direct physical interaction or indirectly via a jointly manipulated object. These physical forces perturb the robot's behavior execution and must be compensated for to complete such tasks successfully. Typically, this problem is tackled with special-purpose force sensors, which, however, are not available on many robotic platforms. In contrast, we propose a machine learning approach based on sensor data such as accelerometer and pressure readings. In the training phase, a statistical model of behavior execution is learned that combines Gaussian Process Regression with a novel periodic kernel. During behavior execution, predictions from the statistical model are continuously compared with stability parameters derived from the current sensor readings. Differences between predicted and measured values that exceed the variance of the statistical model are interpreted as guidance information and used to adapt the robot's behavior. Several examples of cooperative tasks between a human and a humanoid NAO robot demonstrate the feasibility of our approach.
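
The detection scheme described in the abstract can be illustrated with a brief Python sketch. This is not the authors' implementation: scikit-learn's standard ExpSineSquared kernel stands in for the paper's novel periodic kernel, the stability parameter data is synthetic, and the phase values, two-standard-deviation threshold, and the helper guidance_offset are hypothetical choices made for illustration.

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import ExpSineSquared, WhiteKernel

# Training phase (synthetic data): the robot executes its periodic behavior
# undisturbed while a stability parameter derived from accelerometer and
# pressure readings is recorded against the behavior phase.
rng = np.random.default_rng(0)
phase = np.linspace(0.0, 4.0, 200).reshape(-1, 1)        # four behavior cycles
stability = np.sin(2.0 * np.pi * phase).ravel() + 0.05 * rng.standard_normal(200)

# Periodic GP model of undisturbed execution; ExpSineSquared is a stand-in for
# the paper's novel periodic kernel, and WhiteKernel absorbs sensor noise.
kernel = ExpSineSquared(length_scale=1.0, periodicity=1.0) + WhiteKernel(noise_level=0.05)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
gp.fit(phase, stability)

def guidance_offset(current_phase, measured_value, num_std=2.0):
    """Return the deviation used to adapt the behavior, or 0.0 when the
    measurement lies within the model's predictive uncertainty."""
    mean, std = gp.predict(np.array([[current_phase]]), return_std=True)
    deviation = measured_value - mean[0]
    return deviation if abs(deviation) > num_std * std[0] else 0.0

# Execution phase: a human pushing the robot shifts the stability parameter
# outside the predicted band, which is interpreted as guidance.
print(guidance_offset(4.25, 1.40))   # large deviation -> guidance information
print(guidance_offset(4.25, 1.02))   # within model variance -> 0.0

In the setting of the paper, the returned offset would feed the adaptation of the robot's behavior rather than being printed, and the stability parameters would be derived from the accelerometer and pressure sensor readings mentioned in the abstract.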
Pages: 124-129
Page count: 6
Related Papers (50 total)
  • [1] Shared Control for Human-Robot Cooperative Manipulation Tasks
    Petric, Tadej
    Cevzar, Misel
    Babic, Jan
    [J]. ADVANCES IN SERVICE AND INDUSTRIAL ROBOTICS, 2018, 49 : 787 - 796
  • [2] Bilateral Haptic Collaboration for Human-Robot Cooperative Tasks
    Salvietti, Gionata
    Iqbal, Muhammad Zubair
    Prattichizzo, Domenico
    [J]. IEEE ROBOTICS AND AUTOMATION LETTERS, 2020, 5 (02) : 3517 - 3524
  • [3] Predicting Human Intent for Cooperative Physical Human-Robot Interaction Tasks
    Maithani, Harsh
    Ramon, Juan Antonio Corrales
    Mezouar, Youcef
    [J]. 2019 IEEE 15TH INTERNATIONAL CONFERENCE ON CONTROL AND AUTOMATION (ICCA), 2019, : 1523 - 1528
  • [4] Extending Commands Embedded in Actions for Human-Robot Cooperative Tasks
    Kobayashi, Kazuki
    Yamada, Seiji
    [J]. INTERNATIONAL JOURNAL OF SOCIAL ROBOTICS, 2010, 2 (02) : 159 - 173
  • [5] Planning of proactive behaviors for human-robot cooperative tasks under uncertainty
    Kwon, Woo Young
    Suh, Il Hong
    [J]. KNOWLEDGE-BASED SYSTEMS, 2014, 72 : 81 - 95
  • [6] Virtual fixtures with autonomous error compensation for human-robot cooperative tasks
    Castillo-Cruces, Raul A.
    Wahrburg, Juergen
    [J]. ROBOTICA, 2010, 28 : 267 - 277
  • [7] Development of a Human-Robot Cooperative System Based on Visual Information
    Sato, Akitoshi
    Tan, Joo Kooi
    Ono, Yuta
    [J]. INTERNATIONAL WORKSHOP ON ADVANCED IMAGE TECHNOLOGY (IWAIT) 2019, 2019, 11049
  • [8] Multi-modal Proactive Approaching of Humans for Human-Robot Cooperative Tasks
    Naik, Lakshadeep
    Palinko, Oskar
    Bodenhagen, Leon
    Krueger, Norbert
    [J]. 2021 30TH IEEE INTERNATIONAL CONFERENCE ON ROBOT AND HUMAN INTERACTIVE COMMUNICATION (RO-MAN), 2021, : 323 - 329
  • [9] Human-Robot Interaction Through Fingertip Haptic Devices for Cooperative Manipulation Tasks
    Music, Selma
    Prattichizzo, Domenico
    Hirche, Sandra
    [J]. 2019 28TH IEEE INTERNATIONAL CONFERENCE ON ROBOT AND HUMAN INTERACTIVE COMMUNICATION (RO-MAN), 2019,