Virtual Reality and Programming by Demonstration: Teaching a Robot to Grasp a Dynamic Object by the Generalization of Human Demonstrations

Cited by: 6
Authors
Hamon, Ludovic [1 ]
Lucidarme, Philippe [1 ]
Richard, Emmanuelle [1 ]
Richard, Paul [1 ]
Institution
[1] Univ Angers, LISA Lab, F-49000 Angers, France
Source
Keywords
TASK KNOWLEDGE
DOI
10.1162/PRES_a_00047
CLC Number
TP3 [computing technology, computer technology]
Subject Classification Code
0812
Abstract
Humans can perform complex manipulations without consciously forming detailed motion plans. For techniques such as learning by imitation and programming by demonstration, which require a large number of trials and tests, virtual reality offers an effective approach: virtual environments can be built quickly and economically, and can be reinitialized automatically, which is why they are now widely used in robotics. Rather than imitating human actions directly, our goal is to develop an intuitive, interactive method based on user demonstrations for creating humanlike, autonomous behavior in a virtual character or robot. First, the user demonstrates the task in a real-time virtual simulation by controlling the virtual agent. The data needed to accomplish the task (position, speed, etc.) are acquired in Cartesian space during the demonstration session and then generalized offline by a neural network trained with the back-propagation algorithm. The objective is to model a function that represents the studied task and thereby enable the agent to handle new cases. In this study, the virtual agent is a 6-DOF arm manipulator, the KUKA KR6, and the task is to grasp a ball thrown into its workspace. Our approach seeks the minimum number of demonstrations required while maintaining adequate task efficiency. We also study how the number of dimensions of the estimated function relates to the number of human trials, depending on the evolution of the learning system.
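The generalization step described in the abstract can be illustrated with a minimal sketch. This is not the authors' code: the throw-simulation helper, network size, and the simplified 2-D catch point are assumptions for illustration (the paper uses a 6-DOF KUKA KR6 in full Cartesian space). The sketch maps ball-throw parameters recorded from demonstrations to a target catch position using a small feed-forward network trained by back-propagation, then predicts the catch point for an unseen throw.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for demonstration data: each input is a throw
# (release position x0, z0 and velocity vx, vz); the target is where
# the arm should reach to catch the ball on a fixed horizontal plane.
def simulate_throw(params, g=9.81, z_plane=0.5):
    x0, z0, vx, vz = params.T
    # Solve z0 + vz*t - 0.5*g*t^2 = z_plane for the positive root t.
    t = (vz + np.sqrt(vz**2 + 2 * g * (z0 - z_plane))) / g
    return np.stack([x0 + vx * t, np.full_like(t, z_plane)], axis=1)

X = rng.uniform([0, 1.5, 1.0, 0.0], [1, 2.5, 3.0, 2.0], size=(40, 4))  # 40 demos
Y = simulate_throw(X)

# Normalize inputs for stable training; keep the statistics to reuse later.
mu, sd = X.mean(0), X.std(0)
Xn = (X - mu) / sd

# One hidden layer; plain full-batch gradient descent (back-propagation).
W1 = rng.normal(0, 0.5, (4, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, (16, 2)); b2 = np.zeros(2)
lr = 0.1

for _ in range(5000):
    H = np.tanh(Xn @ W1 + b1)        # forward pass
    P = H @ W2 + b2
    dP = (P - Y) / len(Xn)           # gradient of mean squared error
    dH = (dP @ W2.T) * (1 - H**2)    # back-propagate through tanh
    W2 -= lr * H.T @ dP; b2 -= lr * dP.sum(0)
    W1 -= lr * Xn.T @ dH; b1 -= lr * dH.sum(0)

# Generalization: predict the catch point for an unseen throw.
new_throw = np.array([[0.5, 2.0, 2.0, 1.0]])
pred = np.tanh((new_throw - mu) / sd @ W1 + b1) @ W2 + b2
```

The key property, mirroring the paper's objective, is that the network acts as an estimated function of the task: once trained on a small set of demonstrations, it produces a catch configuration for throws it has never seen.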
Pages: 241-253
Page count: 13
Related Papers
(50 total)
  • [31] Learn to Grasp Objects with Dexterous Robot Manipulator from Human Demonstration
    Hu, Yuandong
    Li, Ke
    Wei, Na
    2022 INTERNATIONAL CONFERENCE ON ADVANCED ROBOTICS AND MECHATRONICS (ICARM 2022), 2022, : 1062 - 1067
  • [32] Part-Based Robot Grasp Planning from Human Demonstration
    Aleotti, Jacopo
    Caselli, Stefano
    2011 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA), 2011,
  • [33] A Constrained DMPs Framework for Robot Skills Learning and Generalization From Human Demonstrations
    Lu, Zhenyu
    Wang, Ning
    Yang, Chenguang
    IEEE-ASME TRANSACTIONS ON MECHATRONICS, 2021, 26 (06) : 3265 - 3275
  • [34] Experimental analysis of augmented reality interfaces for robot programming by demonstration in manufacturing
    Chu, Chih-Hsing
    Weng, Chen-Yu
    JOURNAL OF MANUFACTURING SYSTEMS, 2024, 74 : 463 - 476
  • [35] Human grasp position estimation for human-robot cooperative object manipulation
    Ansari, Ramin Jaberzadeh
    Giordano, Giuseppe
    Sjoberg, Jonas
    Karayiannidis, Yiannis
    ROBOTICS AND AUTONOMOUS SYSTEMS, 2020, 131
  • [36] Robot grasp synthesis from virtual demonstration and topology-preserving environment reconstruction
    Aleotti, Jacopo
    Caselli, Stefano
    2007 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS, VOLS 1-9, 2007, : 2698 - 2703
  • [37] Hand-Object Interaction: From Human Demonstrations to Robot Manipulation
    Carfi, Alessandro
    Patten, Timothy
    Kuang, Yingyi
    Hammoud, Ali
    Alameh, Mohamad
    Maiettini, Elisa
    Weinberg, Abraham Itzhak
    Faria, Diego
    Mastrogiovanni, Fulvio
    Alenya, Guillem
    Natale, Lorenzo
    Perdereau, Veronique
    Vincze, Markus
    Billard, Aude
    FRONTIERS IN ROBOTICS AND AI, 2021, 8
  • [38] 6D Object Pose Estimation for Robot Programming by Demonstration
    Ghahramani, Mohammad
    Vakanski, Aleksandar
    Janabi-Sharifi, Farrokh
    PROGRESS IN OPTOMECHATRONIC TECHNOLOGIES, 2019, 233 : 93 - 101
  • [39] Hand Pose Estimation for Robot Programming by Demonstration in Object Manipulation Tasks
    Xu, Jun
    Qian, Kun
    Liu, Huan
    Ma, Xudong
    2018 37TH CHINESE CONTROL CONFERENCE (CCC), 2018, : 5328 - 5333
  • [40] Human to Robot Demonstrations of Routine Home Tasks: Adaptation to the Robot's Preferred Style of Demonstration
    Alissandrakis, Aris
    Miyake, Yoshihiro
    RO-MAN 2009: THE 18TH IEEE INTERNATIONAL SYMPOSIUM ON ROBOT AND HUMAN INTERACTIVE COMMUNICATION, VOLS 1 AND 2, 2009, : 596 - 601