Enhancing the LLM-Based Robot Manipulation Through Human-Robot Collaboration

Cited by: 1
Authors
Liu, Haokun [1 ]
Zhu, Yaonan [2 ,3 ]
Kato, Kenji [4 ]
Tsukahara, Atsushi [4 ]
Kondo, Izumi [4 ]
Aoyama, Tadayoshi [3 ]
Hasegawa, Yasuhisa [3 ]
Affiliations
[1] Nagoya Univ, Dept Mech Syst Engn, Nagoya 4648603, Japan
[2] Univ Tokyo, Sch Engn, Tokyo 1138656, Japan
[3] Nagoya Univ, Dept Micronano Mech Sci & Engn, Nagoya 4648603, Japan
[4] Natl Ctr Geriatr & Gerontol, Obu 4748511, Japan
Source: IEEE Robotics and Automation Letters
Keywords
AI-enabled robotics; human-robot collaboration;
DOI
10.1109/LRA.2024.3415931
CLC Number
TP24 [Robotics];
Discipline Codes
080202; 1405
Abstract
Large Language Models (LLMs) are gaining popularity in the field of robotics. However, LLM-based robots are limited to simple, repetitive motions due to the poor integration between language models, robots, and the environment. This letter proposes a novel approach to enhance the performance of LLM-based autonomous manipulation through Human-Robot Collaboration (HRC). The approach involves using a prompted GPT-4 language model to decompose high-level language commands into sequences of motions that can be executed by the robot. The system also employs a YOLO-based perception algorithm, providing visual cues to the LLM, which aids in planning feasible motions within the specific environment. Additionally, an HRC method is proposed by combining teleoperation and Dynamic Movement Primitives (DMP), allowing the LLM-based robot to learn from human guidance. Real-world experiments have been conducted using the Toyota Human Support Robot for manipulation tasks. The outcomes indicate that tasks requiring complex trajectory planning and reasoning over environments can be efficiently accomplished through the incorporation of human demonstrations.
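The abstract's HRC method couples teleoperation with Dynamic Movement Primitives (DMP) so the robot can learn trajectories from human guidance. As a rough sketch of the DMP component only (a textbook one-dimensional discrete DMP in the style of Ijspeert et al., not the authors' implementation; the class name, gains, and basis-function settings below are illustrative assumptions):

```python
import numpy as np

class DMP1D:
    """Minimal 1-D discrete Dynamic Movement Primitive (illustrative)."""

    def __init__(self, n_basis=20, alpha=25.0, beta=6.25, alpha_x=3.0):
        self.alpha, self.beta, self.alpha_x = alpha, beta, alpha_x
        # Gaussian basis centers spaced along the decaying phase x in (0, 1]
        self.c = np.exp(-alpha_x * np.linspace(0.0, 1.0, n_basis))
        d = np.diff(self.c)
        self.h = 1.0 / np.concatenate([d, [d[-1]]]) ** 2  # basis widths
        self.w = np.zeros(n_basis)

    def _features(self, x):
        # Normalized, phase-gated basis activations
        psi = np.exp(-self.h * (x - self.c) ** 2)
        return psi * x / psi.sum()

    def fit(self, y_demo, dt):
        """Learn forcing-term weights from a single demonstration."""
        self.y0, self.g = y_demo[0], y_demo[-1]
        yd = np.gradient(y_demo, dt)
        ydd = np.gradient(yd, dt)
        t = dt * np.arange(len(y_demo))
        x = np.exp(-self.alpha_x * t)  # canonical phase along the demo
        # Forcing term that would reproduce the demo's accelerations
        f_target = ydd - self.alpha * (self.beta * (self.g - y_demo) - yd)
        Phi = np.stack([self._features(xi) for xi in x])
        self.w, *_ = np.linalg.lstsq(Phi, f_target, rcond=None)

    def rollout(self, n_steps, dt):
        """Integrate the learned DMP forward with explicit Euler steps."""
        y, yd = self.y0, 0.0
        traj = []
        for k in range(n_steps):
            x = np.exp(-self.alpha_x * dt * k)
            f = self._features(x) @ self.w
            ydd = self.alpha * (self.beta * (self.g - y) - yd) + f
            yd += ydd * dt
            y += yd * dt
            traj.append(y)
        return np.array(traj)
```

Fitting a demonstrated trajectory and rolling the primitive out reproduces the demonstrated shape while the spring term pulls the motion to the goal, which is roughly the mechanism that lets a DMP-based robot replay and adapt a teleoperated demonstration.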
Pages: 6904-6911
Number of pages: 8
Related Papers
50 records
  • [21] Safety in human-robot collaboration
    Hofbaur, M.
    Rathmair, M.
    ELEKTROTECHNIK UND INFORMATIONSTECHNIK, 2019, 136 (07): 301-306
  • [22] Effective Human-Robot Collaboration Through Wearable Sensors
    Al-Yacoub, Ali
    Buerkle, Achim
    Flanagan, Myles
    Ferreira, Pedro
    Hubbard, Ella-Mae
    Lohse, Niels
    2020 25TH IEEE INTERNATIONAL CONFERENCE ON EMERGING TECHNOLOGIES AND FACTORY AUTOMATION (ETFA), 2020: 651-658
  • [23] Enhanced indoor positioning through human-robot collaboration
    Tang, Mengyuan
    Zhou, Baoding
    Zhong, Xuanke
    Liu, Xu
    Li, Qingquan
    Urban Informatics, 3 (1):
  • [24] Human-Robot Collaboration based on Dynamic Compensation: from Micro-manipulation to Macro-manipulation
    Huang, Shouren
    Ishikawa, Masatoshi
    Yamakawa, Yuji
    2018 27TH IEEE INTERNATIONAL SYMPOSIUM ON ROBOT AND HUMAN INTERACTIVE COMMUNICATION (IEEE RO-MAN 2018), 2018: 603-604
  • [25] Human modeling for human-robot collaboration
    Hiatt, Laura M.
    Narber, Cody
    Bekele, Esube
    Khemlani, Sangeet S.
    Trafton, J. Gregory
    INTERNATIONAL JOURNAL OF ROBOTICS RESEARCH, 2017, 36 (5-7): 580-596
  • [26] Toward Cognitive Digital Twin System of Human-Robot Collaboration Manipulation
    Li, Xin
    He, Bin
    Wang, Zhipeng
    Zhou, Yanmin
    Li, Gang
    Li, Xiang
    IEEE TRANSACTIONS ON AUTOMATION SCIENCE AND ENGINEERING, 2025, 22: 6677-6690
  • [27] Anticipatory Robot Control for Efficient Human-Robot Collaboration
    Huang, Chien-Ming
    Mutlu, Bilge
    ELEVENTH ACM/IEEE INTERNATIONAL CONFERENCE ON HUMAN ROBOT INTERACTION (HRI'16), 2016: 83-90
  • [28] Using a LLM-Based Conversational Agent in the Social Robot Mini
    Esteban-Lozano, Ivan
    Castro-Gonzalez, Alvaro
    Martinez, Paloma
    ARTIFICIAL INTELLIGENCE IN HCI, PT III, AI-HCI 2024, 2024, 14736: 15-26
  • [29] Simulation-based Digital Twin for enhancing human-robot collaboration in assembly systems
    Cimino, Antonio
    Longo, Francesco
    Nicoletti, Letizia
    Solina, Vittorio
    JOURNAL OF MANUFACTURING SYSTEMS, 2024, 77: 903-918
  • [30] Robot Collaboration and Model Reliance Based on Its Trust in Human-Robot Interaction
    Alhaji, Basel
    Prilla, Michael
    Rausch, Andreas
    HUMAN-COMPUTER INTERACTION - INTERACT 2023, PT II, 2023, 14143: 17-39