Enhancing the LLM-Based Robot Manipulation Through Human-Robot Collaboration

Cited by: 1

Authors
Liu, Haokun [1 ]
Zhu, Yaonan [2 ,3 ]
Kato, Kenji [4 ]
Tsukahara, Atsushi [4 ]
Kondo, Izumi [4 ]
Aoyama, Tadayoshi [3 ]
Hasegawa, Yasuhisa [3 ]
Affiliations
[1] Nagoya Univ, Dept Mech Syst Engn, Nagoya 4648603, Japan
[2] Univ Tokyo, Sch Engn, Tokyo 1138656, Japan
[3] Nagoya Univ, Dept Micronano Mech Sci & Engn, Nagoya 4648603, Japan
[4] Natl Ctr Geriatr & Gerontol, Obu 4748511, Japan
Keywords
AI-enabled robotics; human-robot collaboration
DOI
10.1109/LRA.2024.3415931
CLC Number
TP24 (Robotics)
Subject Classification
080202; 1405
Abstract
Large Language Models (LLMs) are gaining popularity in the field of robotics. However, LLM-based robots are limited to simple, repetitive motions due to the poor integration between language models, robots, and the environment. This letter proposes a novel approach to enhance the performance of LLM-based autonomous manipulation through Human-Robot Collaboration (HRC). The approach involves using a prompted GPT-4 language model to decompose high-level language commands into sequences of motions that can be executed by the robot. The system also employs a YOLO-based perception algorithm, providing visual cues to the LLM, which aids in planning feasible motions within the specific environment. Additionally, an HRC method is proposed by combining teleoperation and Dynamic Movement Primitives (DMP), allowing the LLM-based robot to learn from human guidance. Real-world experiments have been conducted using the Toyota Human Support Robot for manipulation tasks. The outcomes indicate that tasks requiring complex trajectory planning and reasoning over environments can be efficiently accomplished through the incorporation of human demonstrations.
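The HRC method described in the abstract builds on Dynamic Movement Primitives (DMP), which encode a demonstrated trajectory as a learned forcing term on top of a stable point attractor. As a rough illustration of that idea only, below is a minimal one-dimensional DMP in the standard Ijspeert formulation; the gains, basis-function count, and minimum-jerk demonstration are illustrative assumptions, not the letter's reported implementation or parameters.

```python
import numpy as np

class DMP1D:
    """Minimal discrete Dynamic Movement Primitive (one dimension).

    Transformation system:  tau*z' = alpha*(beta*(g - y) - z) + f(x)
                            tau*y' = z
    Canonical system:       tau*x' = -alpha_x * x
    Forcing term f(x) is a normalized mix of Gaussian basis functions,
    scaled by x*(g - y0) so it vanishes as the phase x decays to 0.
    """

    def __init__(self, n_basis=20, alpha=25.0, beta=6.25, alpha_x=4.0):
        self.n_basis, self.alpha, self.beta, self.alpha_x = n_basis, alpha, beta, alpha_x
        # Basis centers spaced evenly in phase (exponential in time);
        # widths use a common heuristic so neighbors overlap.
        self.c = np.exp(-alpha_x * np.linspace(0, 1, n_basis))
        self.h = n_basis ** 1.5 / self.c / alpha_x
        self.w = np.zeros(n_basis)

    def _psi(self, x):
        return np.exp(-self.h * (x - self.c) ** 2)

    def fit(self, y_demo, dt):
        """Learn forcing-term weights from one demonstrated trajectory."""
        self.y0, self.g = y_demo[0], y_demo[-1]
        self.T = len(y_demo) * dt                      # tau = demo duration
        yd = np.gradient(y_demo, dt)
        ydd = np.gradient(yd, dt)
        t = np.arange(len(y_demo)) * dt
        x = np.exp(-self.alpha_x * t / self.T)
        # Invert the transformation system to get the target forcing term.
        f_target = (self.T ** 2) * ydd - self.alpha * (
            self.beta * (self.g - y_demo) - self.T * yd)
        # One weighted least-squares solve per basis function.
        s = x * (self.g - self.y0)
        for i in range(self.n_basis):
            psi = np.exp(-self.h[i] * (x - self.c[i]) ** 2)
            self.w[i] = (s * psi) @ f_target / ((s * psi) @ s + 1e-10)
        return self

    def rollout(self, dt, goal=None):
        """Integrate the DMP; passing a new goal reshapes the learned motion."""
        g = self.g if goal is None else goal
        y, z, x = self.y0, 0.0, 1.0
        traj = [y]
        for _ in range(int(self.T / dt)):
            psi = self._psi(x)
            f = (psi @ self.w) / (psi.sum() + 1e-10) * x * (g - self.y0)
            z += (self.alpha * (self.beta * (g - y) - z) + f) / self.T * dt
            y += z / self.T * dt
            x += -self.alpha_x * x / self.T * dt
            traj.append(y)
        return np.array(traj)
```

Because the forcing term decays with the phase variable, the attractor dynamics guarantee convergence to the goal even under goal changes, which is what makes DMPs convenient for generalizing a single human demonstration.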
Pages: 6904 - 6911
Number of pages: 8
Related Papers (50 total)
  • [41] Advancing Assembly Through Human-Robot Collaboration: Framework and Implementation
    Mohammed, Abdullah
    Wang, Lihui
    REINVENTING MECHATRONICS: DEVELOPING FUTURE DIRECTIONS FOR MECHATRONICS, 2020, : 111 - 126
  • [42] Human-Robot Collaboration: Task sharing through Virtual Reality
    Shu, Beibei
    Sziebig, Gabor
    Pieska, Sakari
    IECON 2018 - 44TH ANNUAL CONFERENCE OF THE IEEE INDUSTRIAL ELECTRONICS SOCIETY, 2018, : 6040 - 6044
  • [43] Human-Robot Interaction and User Manipulation
    Bertolini, Andrea
    Carli, Rachele
    PERSUASIVE TECHNOLOGY (PERSUASIVE 2022), 2022, 13213 : 43 - 57
  • [44] Representation and Execution of Social Plans through Human-Robot Collaboration
    Nardi, Lorenzo
    Iocchi, Luca
    SOCIAL ROBOTICS, 2014, 8755 : 266 - 275
  • [45] Evaluating Fluency in Human-Robot Collaboration
    Hoffman, Guy
    IEEE TRANSACTIONS ON HUMAN-MACHINE SYSTEMS, 2019, 49 (03) : 209 - 218
  • [46] Human-Robot Collaboration for a Shared Mission
    Karami, Abir-Beatrice
    Jeanpierre, Laurent
    Mouaddib, Abdel-Illah
    PROCEEDINGS OF THE 5TH ACM/IEEE INTERNATIONAL CONFERENCE ON HUMAN-ROBOT INTERACTION (HRI 2010), 2010, : 155 - 156
  • [47] Qualification requirements for human-robot collaboration
    Weber M.-A.
    Schüth N.J.
    Stowasser S.
    ZWF Zeitschrift fuer Wirtschaftlichen Fabrikbetrieb, 2018, 113 (10): : 619 - 622
  • [48] Measurement of trust in human-robot collaboration
    Freedy, Amos
    DeVisser, Ewart
    Weltman, Gershon
    Coeyman, Nicole
    CTS 2007: PROCEEDINGS OF THE 2007 INTERNATIONAL SYMPOSIUM ON COLLABORATIVE TECHNOLOGIES AND SYSTEMS, 2007, : 106 - 114
  • [49] Collaboration, dialogue, and human-robot interaction
    Fong, T
    Thorpe, C
    Baur, C
    ROBOTICS RESEARCH, 2003, 6 : 255 - 266
  • [50] Multimodal Interface for Human-Robot Collaboration
    Rautiainen, Samu
    Pantano, Matteo
    Traganos, Konstantinos
    Ahmadi, Seyedamir
    Saenz, Jose
    Mohammed, Wael M.
    Lastra, Jose L. Martinez
    MACHINES, 2022, 10 (10)