Learning from Demonstration Facilitates Human-Robot Collaborative Task Execution

Cited: 0
Authors
Koskinopoulou, Maria [1]
Piperakis, Stylianos
Trahanias, Panos
Affiliations
[1] Fdn Res & Technol Hellas FORTH, Inst Comp Sci, Iraklion, Greece
Keywords
Learning from Demonstration; observation space; latent space; Gaussian Process; human-robot collaboration;
DOI
N/A
Chinese Library Classification
TP3 [Computing technology, computer technology]
Discipline Code
0812
Abstract
Learning from Demonstration (LfD) is addressed in this work in order to establish a novel framework for Human-Robot Collaborative (HRC) task execution. In this context, a robotic system is trained to perform various actions by observing a human demonstrator. We formulate a latent representation of observed behaviors and associate it with the corresponding representation of target robotic behaviors. Effectively, a mapping from observed to performed actions is defined that abstracts away action variations and differences between the human and robotic manipulators, and facilitates the execution of newly observed actions. The learned action behaviors are then employed to accomplish task execution in an HRC scenario. The experimental results cover the successful training of a robotic arm with various action behaviors and its subsequent deployment in HRC task accomplishment, demonstrating the validity and efficacy of the proposed approach in human-robot collaborative setups.
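The record does not include code; the following is a minimal illustrative sketch of the kind of observed-to-performed mapping the keywords name (latent space, Gaussian Process): GP regression from latent coordinates of observed human actions to robot action-space targets. All function names and the toy data here are invented for illustration, not taken from the paper.

```python
import numpy as np

def rbf_kernel(A, B, length_scale=1.0):
    # Squared-exponential (RBF) kernel between the row vectors of A and B.
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-0.5 * d2 / length_scale**2)

def gp_map(X_demo, Y_robot, X_new, noise=1e-6, length_scale=1.0):
    """GP-regression predict: latent observation coords -> robot targets."""
    K = rbf_kernel(X_demo, X_demo, length_scale) + noise * np.eye(len(X_demo))
    Ks = rbf_kernel(X_new, X_demo, length_scale)
    # Posterior mean of the GP at the new latent points.
    return Ks @ np.linalg.solve(K, Y_robot)

# Toy demonstrations: 1-D latent coords mapped to 2-D robot joint targets.
X_demo = np.array([[0.0], [0.5], [1.0]])
Y_robot = np.array([[0.0, 0.1], [0.4, 0.5], [0.9, 1.0]])
pred = gp_map(X_demo, Y_robot, np.array([[0.5]]))
```

With near-zero observation noise the GP interpolates the demonstrations, so querying at a demonstrated latent point reproduces the corresponding robot target; novel latent points yield smooth kernel-weighted blends of the demonstrations, which is the generalization property the abstract relies on for newly observed actions.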
Pages: 59-66
Page count: 8
Related Papers
50 records total
  • [21] Human-robot collaborative assembly task planning for mobile cobots based on deep reinforcement learning
    Hou, Wenbin
    Xiong, Zhihua
    Yue, Ming
    Chen, Hao
    PROCEEDINGS OF THE INSTITUTION OF MECHANICAL ENGINEERS PART C-JOURNAL OF MECHANICAL ENGINEERING SCIENCE, 2024, : 11097 - 11114
  • [22] Collaborative task execution by a human and an autonomous mobile robot in a teleoperated system
    Kawabata, K
    Ishikawa, T
    Fujii, T
    Asama, H
    Endo, I
    INTEGRATED COMPUTER-AIDED ENGINEERING, 1999, 6 (04) : 319 - 329
  • [23] Combining human guidance and structured task execution during physical human-robot collaboration
    Cacace, Jonathan
    Caccavale, Riccardo
    Finzi, Alberto
    Grieco, Riccardo
    JOURNAL OF INTELLIGENT MANUFACTURING, 2023, 34 (07) : 3053 - 3067
  • [24] A data fusion system for controlling the execution status in human-robot collaborative cells
    Argyrou, Angelos
    Giannoulis, Christos
    Sardelis, Andreas
    Karagiannis, Panagiotis
    Michalos, George
    Makris, Sotiris
    7TH CIRP CONFERENCE ON ASSEMBLY TECHNOLOGIES AND SYSTEMS (CATS 2018), 2018, 76 : 193 - 198
  • [25] Digital twin-enabled advance execution for human-robot collaborative assembly
    Liu, Sichao
    Wang, Xi Vincent
    Wang, Lihui
    CIRP ANNALS-MANUFACTURING TECHNOLOGY, 2022, 71 (01) : 25 - 28
  • [26] The PHARAOH Procedure Execution Architecture for Autonomous Robots or Collaborative Human-Robot Teams
    Hart, Stephen
    Kramer, James
    Gee, Seth
    Burridge, Robert R.
    2016 25TH IEEE INTERNATIONAL SYMPOSIUM ON ROBOT AND HUMAN INTERACTIVE COMMUNICATION (RO-MAN), 2016, : 888 - 895
  • [27] Robot Learning from Human Demonstration of Peg-in-Hole Task
    Wang, Peng
    Zhu, Jianxin
    Feng, Wei
    Ou, Yongsheng
    2018 IEEE 8TH ANNUAL INTERNATIONAL CONFERENCE ON CYBER TECHNOLOGY IN AUTOMATION, CONTROL, AND INTELLIGENT SYSTEMS (IEEE-CYBER), 2018, : 318 - 322
  • [28] Learning a Pick-and-Place Robot Task from Human Demonstration
    Lin, Hsien-I
    Cheng, Chia-Hsien
    Chen, Wei-Kai
    2013 CACS INTERNATIONAL AUTOMATIC CONTROL CONFERENCE (CACS), 2013, : 312 - +
  • [29] Role of visuospatial processes in learning from demonstration: Implications for human-robot dynamics
    Gentili, Rodolphe J.
    Oh, Hyuk
    Huang, Di-Wei
    Katz, Garrett E.
    Reggia, James A.
    JOURNAL OF SPORT & EXERCISE PSYCHOLOGY, 2015, 37 : S40 - S40
  • [30] Experimental study on haptic communication of a human in a shared human-robot collaborative task
    Dumora, Julie
    Geffard, Franck
    Bidard, Catherine
    Brouillet, Thibaut
    Fraisse, Philippe
    2012 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2012, : 5137 - 5144