Learning Human-to-Robot Dexterous Handovers for Anthropomorphic Hand

Cited by: 8
Authors
Duan, Haonan [1 ,2 ]
Wang, Peng [1 ,2 ,3 ,4 ]
Li, Yiming [1 ,2 ]
Li, Daheng [1 ,2 ]
Wei, Wei [1 ,2 ]
Affiliations
[1] Chinese Acad Sci, Inst Automat, Beijing 100190, Peoples R China
[2] Univ Chinese Acad Sci, Sch Artificial Intelligence, Beijing 100049, Peoples R China
[3] Chinese Acad Sci, CAS Ctr Excellence Brain Sci & Intelligence Techn, Shanghai 200031, Peoples R China
[4] Chinese Acad Sci, Hong Kong Inst Sci & Innovat, Ctr Artificial Intelligence & Robot, Hong Kong, Peoples R China
Funding
National Natural Science Foundation of China
关键词
Anthropomorphic hand; handovers; human-robot interaction; MOVEMENT PRIMITIVES; GRASP;
DOI
10.1109/TCDS.2022.3203025
CLC Number
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Human-robot interaction plays an important role in robots serving human production and life. Object handover between humans and robots is one of the fundamental problems of human-robot interaction. The majority of current work uses parallel-jaw grippers as the end-effector, which limits the robot's ability to grasp diverse objects from humans and manipulate them subsequently. In this article, we present a framework for human-to-robot dexterous handover using an anthropomorphic hand. The framework takes images captured by two cameras to complete handover scene understanding, grasp configuration prediction, and handover execution. To enable the robot to generalize to delivered objects of miscellaneous shapes and sizes, we propose an anthropomorphic hand grasp network (AHG-Net), an end-to-end network that takes the single-view point cloud of the object as input and predicts suitable anthropomorphic hand configurations across five grasp taxonomies. To train our model, we build a large-scale data set with 1M hand grasp annotations from 5K single-view point clouds of 200 objects. Based on the presented framework, we implement a handover system using a UR5 robot arm and the HIT-DLR II anthropomorphic robot hand, which not only adapts to different human givers but also generalizes to novel objects of various shapes and sizes. The generalizability, reliability, and robustness of our method are demonstrated on 15 novel objects handed over in arbitrary poses from frontal and lateral positions, through a system ablation study, a grasp planner comparison, and a user study with 6 participants delivering 15 objects from two benchmark sets.
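The abstract describes AHG-Net as mapping a single-view object point cloud to a grasp taxonomy plus an anthropomorphic hand configuration. The paper's actual architecture is not reproduced here; the sketch below is a minimal illustrative mock under assumed details: a toy permutation-invariant point-cloud encoder feeding two heads, one classifying among five hypothetical taxonomy names and one regressing an assumed 15 joint angles. All weights are random stand-ins, not learned parameters.

```python
import numpy as np

# Hypothetical taxonomy names and DoF count -- not taken from the paper.
GRASP_TAXONOMIES = ["power", "precision", "lateral", "tripod", "spherical"]
NUM_JOINTS = 15

def encode_point_cloud(points: np.ndarray) -> np.ndarray:
    """Toy permutation-invariant encoder: per-point features, max-pooled
    into a global descriptor (stands in for the learned backbone)."""
    feats = np.tanh(points @ np.ones((3, 8)))  # (N, 8) per-point features
    return feats.max(axis=0)                   # (8,) global feature

def predict_grasp(points: np.ndarray):
    """Mock forward pass: classify a grasp taxonomy and regress joint angles."""
    g = encode_point_cloud(points)
    # Random projection heads standing in for trained layers.
    cls_w = np.random.default_rng(0).normal(size=(8, len(GRASP_TAXONOMIES)))
    reg_w = np.random.default_rng(1).normal(size=(8, NUM_JOINTS))
    taxonomy = GRASP_TAXONOMIES[int(np.argmax(g @ cls_w))]
    joint_angles = np.clip(g @ reg_w, -1.0, 1.0)  # normalized joint commands
    return taxonomy, joint_angles

# Usage: a fake single-view point cloud of 1024 3-D points.
cloud = np.random.default_rng(42).normal(size=(1024, 3))
taxonomy, angles = predict_grasp(cloud)
print(taxonomy, angles.shape)
```

The two-head split (discrete taxonomy plus continuous joint configuration) is the design choice the abstract implies; in a real system the regression head would typically be conditioned on the predicted taxonomy.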
Pages: 1224-1238
Page count: 15
Related Papers
50 records total
  • [11] Model Predictive Control for Fluid Human-to-Robot Handovers
    Yang, Wei
    Sundaralingam, Balakumar
    Paxton, Chris
    Akinola, Iretiayo
    Chao, Yu-Wei
    Cakmak, Maya
    Fox, Dieter
    2022 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION, ICRA 2022, 2022, : 6956 - 6962
  • [12] Preemptive Motion Planning for Human-to-Robot Indirect Placement Handovers
    Choi, Andrew
    Jawed, Mohammad Khalid
    Joo, Jungseock
    2022 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION, ICRA 2022, 2022, : 4743 - 4749
  • [13] Exploration of Geometry and Forces Occurring within Human-to-Robot Handovers
    Pan, Matthew K. X. J.
    Croft, Elizabeth A.
    Niemeyer, Gunter
    2018 IEEE HAPTICS SYMPOSIUM (HAPTICS), 2018, : 327 - 333
  • [14] HandoverSim: A Simulation Framework and Benchmark for Human-to-Robot Object Handovers
    Chao, Yu-Wei
    Paxton, Chris
    Xiang, Yu
    Yang, Wei
    Sundaralingam, Balakumar
    Chen, Tao
    Murali, Adithyavairavan
    Cakmak, Maya
    Fox, Dieter
    2022 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION, ICRA 2022, 2022, : 6941 - 6947
  • [15] Benchmark for Human-to-Robot Handovers of Unseen Containers With Unknown Filling
    Sanchez-Matilla, Ricardo
    Chatzilygeroudis, Konstantinos
    Modas, Apostolos
    Duarte, Nuno Ferreira
    Xompero, Alessio
    Frossard, Pascal
    Billard, Aude
    Cavallaro, Andrea
    IEEE ROBOTICS AND AUTOMATION LETTERS, 2020, 5 (02) : 1642 - 1649
  • [16] Dexterous anthropomorphic robot hand with distributed tactile sensor: Gifu hand II
    Kawasaki, H
    Komatsu, T
    Uchiyama, K
    IEEE-ASME TRANSACTIONS ON MECHATRONICS, 2002, 7 (03) : 296 - 303
  • [17] Dexterous Object Manipulation with an Anthropomorphic Robot Hand via Natural Hand Pose Transformer and Deep Reinforcement Learning
    Lopez, Patricio Rivera
    Oh, Ji-Heon
    Jeong, Jin Gyun
    Jung, Hwanseok
    Lee, Jin Hyuk
    Jaramillo, Ismael Espinoza
    Chola, Channabasava
    Lee, Won Hee
    Kim, Tae-Seong
    APPLIED SCIENCES-BASEL, 2023, 13 (01):
  • [18] A Dexterous, Reconfigurable, Adaptive Robot Hand Combining Anthropomorphic and Interdigitated Configurations
    Gao, Geng
    Chapman, Jayden
    Matsunaga, Saori
    Mariyama, Toshisada
    MacDonald, Bruce
    Liarokapis, Minas
    2021 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2021, : 7209 - 7215
  • [19] RoverNet: Vision-Based Adaptive Human-to-Robot Object Handovers
    Mavsar, Matija
    Ude, Ales
    2022 IEEE-RAS 21ST INTERNATIONAL CONFERENCE ON HUMANOID ROBOTS (HUMANOIDS), 2022, : 858 - 864
  • [20] Progressive Transfer Learning for Dexterous In-Hand Manipulation With Multifingered Anthropomorphic Hand
    Luo, Yongkang
    Li, Wanyi
    Wang, Peng
    Duan, Haonan
    Wei, Wei
    Sun, Jia
    IEEE TRANSACTIONS ON COGNITIVE AND DEVELOPMENTAL SYSTEMS, 2024, 16 (06) : 2019 - 2031