Learning Grasping for Robot with Parallel Gripper from Human Demonstration via Contact Analysis

Cited by: 1
Authors
Zhang, Zhengshen [1 ]
Liu, Chenchen [1 ]
Zhou, Lei [1 ]
Sun, Jiawei [1 ]
Liu, Zhiyang [1 ]
Ang, Marcelo H., Jr. [2 ]
Lu, Wen Feng [2 ]
Tay, Francis E. H. [2 ]
Affiliations
[1] Natl Univ Singapore, Adv Robot Ctr, Singapore, Singapore
[2] Natl Univ Singapore, Dept Mech Engn, Singapore, Singapore
Keywords
Grasping; imitation learning; machine learning
DOI
10.1109/ICCRE61448.2024.10589743
Chinese Library Classification
TP [Automation Technology; Computer Technology]
Discipline Classification Code
0812
Abstract
Recent studies in robotic grasping have predominantly concentrated on generating valid grasps from the geometric features of target objects, using either analytical or deep learning methods. Although such approaches have proved successful in simple picking tasks, they often fall short in task-oriented robotic grasping, which requires the grasp pose to be restricted to specific parts of the object. In human hand-object interaction, individuals tend to grasp particular parts of an object to facilitate subsequent tasks, drawing on knowledge learned from daily-life experience. Some previous research has explored mapping the human hand pose to a dexterous gripper with a similar number of degrees of freedom (DoF). Nevertheless, the majority of robotic grippers are still parallel jaw grippers, which makes mapping high-DoF human hand poses to the low-DoF grasp pose of a parallel gripper challenging. In this paper, we propose three schemes that map human hand poses to the grasp pose of a parallel gripper. Our quantitative results demonstrate that the best-performing mapping scheme achieves an overall success rate of 87% in robust robotic grasping. Video: https://youtu.be/PXSF6HI5u6k.
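The abstract describes mapping a high-DoF human hand pose to the low-DoF pose of a parallel jaw gripper, but it does not detail the three proposed schemes. Below is a minimal sketch of one plausible contact-based mapping, assuming 21 hand keypoints in the MediaPipe ordering (wrist at index 0, thumb tip at 4, index fingertip at 8). The keypoint indices, the fingertip-midpoint heuristic, and the 0.08 m maximum jaw opening are illustrative assumptions, not the schemes proposed in the paper.

import numpy as np

MAX_GRIPPER_WIDTH = 0.08  # assumed maximum jaw opening in metres (illustrative)

def hand_to_gripper_pose(keypoints):
    """Map 21x3 hand keypoints to a 4x4 gripper pose and a jaw width.

    Hypothetical sketch only: the keypoint ordering (wrist = 0, thumb tip = 4,
    index fingertip = 8) follows the MediaPipe convention and is not the
    mapping scheme described in the paper.
    """
    kp = np.asarray(keypoints, dtype=float)
    wrist, thumb_tip, index_tip = kp[0], kp[4], kp[8]

    # Treat the thumb and index fingertips as the two intended contact points.
    center = 0.5 * (thumb_tip + index_tip)
    width = float(np.clip(np.linalg.norm(index_tip - thumb_tip), 0.0, MAX_GRIPPER_WIDTH))

    # Jaw-closing axis: the line connecting the two contact points.
    close_axis = index_tip - thumb_tip
    close_axis = close_axis / np.linalg.norm(close_axis)

    # Approach axis: from the wrist toward the grasp center, projected to be
    # orthogonal to the closing axis so the resulting frame stays well formed.
    approach = center - wrist
    approach = approach - np.dot(approach, close_axis) * close_axis
    approach = approach / np.linalg.norm(approach)

    # Third axis completes a right-handed frame (close_axis x binormal = approach).
    binormal = np.cross(approach, close_axis)

    pose = np.eye(4)
    pose[:3, 0] = close_axis   # gripper x: jaw-closing direction
    pose[:3, 1] = binormal     # gripper y
    pose[:3, 2] = approach     # gripper z: approach direction
    pose[:3, 3] = center       # grasp center between the fingertips
    return pose, width

A caller would obtain the pair from detected keypoints, e.g. pose, width = hand_to_gripper_pose(detected_keypoints), and would typically refine the result against the object geometry (for example, snapping the contact line to the nearest antipodal pair on the object surface) before execution.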
Pages: 86-91
Number of pages: 6
Related Papers (50 in total)
  • [1] Learning Grasping for Robot with Parallel Gripper from Human Demonstration via Contact Analysis
    Zhang, Zhengshen
    Liu, Chenchen
    Zhou, Lei
    Sun, Jiawei
    Liu, Zhiyang
    Ang, Marcelo H.
    Lu, Wen Feng
    Tay, Francis E. H.
    2024 9th International Conference on Control and Robotics Engineering, ICCRE 2024, 2024, : 86 - 91
  • [2] A Gripper-like Exoskeleton Design for Robot Grasping Demonstration
    Dai, Hengtai
    Lu, Zhenyu
    He, Mengyuan
    Yang, Chenguang
    ACTUATORS, 2023, 12 (01)
  • [3] Force control of a robot gripper based on human grasping schemes
    Nakazawa, N
    Kim, IH
    Inooka, H
    Ikeura, R
    CONTROL ENGINEERING PRACTICE, 2001, 9 (07) : 735 - 742
  • [4] Grasping Analysis for a 3-Finger Adaptive Robot Gripper
    Sadun, Amirul Syafiq
    Jalani, Jamaludin
    Jamil, Faizal
    2016 2ND IEEE INTERNATIONAL SYMPOSIUM ON ROBOTICS AND MANUFACTURING AUTOMATION (ROMA), 2016,
  • [5] Learning Form Closure Grasping with a Four-Pin Parallel Gripper
    Li, Rui
    Liu, Shimin
    Su, Xiaojie
    APPLIED SCIENCES-BASEL, 2023, 13 (04):
  • [6] From human grasping to robot grasping
    Oztop, Erhan
    Ozyer, Baris
    Ugur, Emre
    Kawato, Mitsuo
    NEUROSCIENCE RESEARCH, 2009, 65 : S183 - S183
  • [7] Human–robot skill transmission for mobile robot via learning by demonstration
    Jiehao Li
    Junzheng Wang
    Shoukun Wang
    Chenguang Yang
    Neural Computing and Applications, 2023, 35 : 23441 - 23451
  • [8] Learning from Human Hand Demonstration for Wire Harness Grasping
    Kamiya, Keita
    Wang, Yusheng
    Lu, Jiaxi
    Kondoh, Shinsuke
    Kanda, Shinji
    Honda, Yukio
    Mizoguchi, Hiroshi
    Nishio, Masahiro
    Makino, Koji
    Ota, Jun
    2024 IEEE INTERNATIONAL CONFERENCE ON ADVANCED INTELLIGENT MECHATRONICS, AIM 2024, 2024, : 1645 - 1650
  • [9] An Intuitive Robot Learning from Human Demonstration
    Ogenyi, Uchenna Emeoha
    Zhang, Gongyue
    Yang, Chenguang
    Ju, Zhaojie
    Liu, Honghai
    INTELLIGENT ROBOTICS AND APPLICATIONS (ICIRA 2018), PT I, 2018, 10984 : 176 - 185
  • [10] Learning Grasping Force from Demonstration
    Lin, Yun
    Ren, Shaogang
    Clevenger, Matthew
    Sun, Yu
    2012 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA), 2012, : 1526 - 1531