Scalable, Intuitive Human to Robot Skill Transfer with Wearable Human Machine Interfaces: On Complex, Dexterous Tasks

Cited: 0
Authors
Sanches, Felipe [1 ]
Gao, Geng [2 ]
Elangovan, Nathan [1 ]
Godoy, Ricardo V. [1 ]
Chapman, Jayden [1 ]
Wang, Ke [2 ]
Jarvis, Patrick [2 ]
Liarokapis, Minas [1 ]
Affiliations
[1] Univ Auckland, Dept Mech & Mechatron Engn, New Dexter Res Grp, Auckland, New Zealand
[2] Acumino, Auckland, New Zealand
DOI: 10.1109/IROS55552.2023.10341661
Chinese Library Classification: TP18 [Artificial Intelligence Theory]
Discipline codes: 081104; 0812; 0835; 1405
Abstract
The advent of collaborative industrial and household robotics has blurred the demarcation between the human and robot workspace. For robots to function efficiently alongside humans, research must move beyond the traditional well-structured laboratory into dynamic environments. In this work, we propose an efficient skill transfer methodology comprising intuitive interfaces, efficient optical tracking systems, and compliant control of robotic arm-hand systems. The lightweight wearable interfaces, mounted with robotic grippers and hands, allow the execution of dexterous activities in dynamic environments without restricting human dexterity. Fiducial and reflective markers mounted on the interfaces facilitate the extraction of positional and rotational information, enabling efficient trajectory tracking. As the tasks are performed using the mounted grippers and hands, gripper state information can be transferred directly. The hardware-agnostic nature and efficiency of the proposed interfaces and skill transfer methodology are demonstrated through the execution of complex tasks that require increased dexterity: writing and drawing.
Pages: 6318-6325 (8 pages)
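The abstract describes extracting positional and rotational information from markers mounted on the wearable interface and using it for trajectory tracking. A standard way to express this is to compose the tracked marker pose with a fixed, pre-calibrated marker-to-gripper offset via homogeneous transforms; the sketch below illustrates that composition (the frame names, offset values, and helper function are illustrative assumptions, not details from the paper):

```python
import numpy as np

def pose_to_T(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation matrix and a translation."""
    T = np.eye(4)
    T[:3, :3] = np.asarray(R, dtype=float)
    T[:3, 3] = np.asarray(t, dtype=float)
    return T

# Hypothetical tracked marker pose in the camera frame: identity rotation, 1 m along x.
T_cam_marker = pose_to_T(np.eye(3), [1.0, 0.0, 0.0])

# Hypothetical fixed marker-to-gripper offset (calibrated once): 10 cm along marker z.
T_marker_gripper = pose_to_T(np.eye(3), [0.0, 0.0, 0.1])

# Gripper pose in the camera frame is the composition of the two transforms;
# evaluating this at each tracked frame yields the end-effector trajectory.
T_cam_gripper = T_cam_marker @ T_marker_gripper
print(T_cam_gripper[:3, 3])  # -> [1.  0.  0.1]
```

In practice the marker pose itself would come from a fiducial-tracking pipeline (e.g. ArUco detection plus PnP pose estimation), but the downstream trajectory computation reduces to compositions of this form.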