Robot Reaching Movement Synthesis by Human Demonstration

Cited by: 0
Authors
Lin, Hsien-I [1 ]
Lai, Chun-Chia [1 ]
Affiliations
[1] Natl Taipei Univ Technol, Grad Inst Automat Technol, Taipei, Taiwan
Keywords
Robot reaching; human demonstration; Gaussian Mixture Model; collision-free; SYSTEMS; ARM
DOI
Not available
CLC number
TP301 [Theory and Methods]
Subject classification code
081202
Abstract
Reaching for an object is a fundamental skill for a robot. Robot reaching aims to bring the robot's hand to the object location without colliding with obstacles. Since implementing such reaching movements is a tedious task, this paper proposes a method to synthesize robot reaching movements from human demonstration. Because human demonstrations are inconsistent, the paper adopts a Gaussian Mixture Model (GMM) to obtain a generalized reaching movement, which is then used to synthesize new movements that avoid obstacles by adjusting the parameters of the Gaussian model. With the proposed method, robot reaching movements are goal-directed, collision-free, and human-like. We validated the proposed method on an Aldebaran Robotics NAO humanoid robot with 25 degrees of freedom. The results showed that the NAO robot was able to perform skillful reaching movements to the object.
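The abstract describes learning a generalized reaching movement from several inconsistent demonstrations by fitting a GMM and regressing a single trajectory from it. The sketch below is a minimal illustration of that GMM/GMR step, not the authors' implementation: it assumes demonstrations are recorded as (time, x, y, z) samples, and all function names and the synthetic demonstration data are illustrative.

```python
# Minimal GMM/GMR sketch for generalizing demonstrated reaching trajectories.
# Assumption: each demonstration is an (N, 4) array of (t, x, y, z) samples.
import numpy as np
from sklearn.mixture import GaussianMixture
from scipy.stats import norm

def fit_reaching_gmm(demos, n_components=5, seed=0):
    """Stack samples from all demonstrations and fit a full-covariance GMM."""
    data = np.vstack(demos)                          # shape (N_total, 4)
    return GaussianMixture(n_components=n_components,
                           covariance_type="full",
                           random_state=seed).fit(data)

def gmr(gmm, t_query):
    """Gaussian Mixture Regression: expected (x, y, z) given time t."""
    traj = []
    for t in np.atleast_1d(t_query):
        # responsibility of each component for this time step
        h = np.array([w * norm.pdf(t, m[0], np.sqrt(c[0, 0]))
                      for w, m, c in zip(gmm.weights_, gmm.means_, gmm.covariances_)])
        h /= h.sum()
        point = np.zeros(3)
        for k, (m, c) in enumerate(zip(gmm.means_, gmm.covariances_)):
            # conditional mean of the spatial dimensions given time t
            point += h[k] * (m[1:] + c[1:, 0] / c[0, 0] * (t - m[0]))
        traj.append(point)
    return np.array(traj)

# Example: three noisy straight-line demonstrations toward a target point.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 50)
demos = [np.column_stack([t, np.outer(t, [0.3, 0.2, 0.4])
                             + 0.01 * rng.standard_normal((50, 3))])
         for _ in range(3)]
gmm = fit_reaching_gmm(demos)
generalized = gmr(gmm, t)      # generalized reaching trajectory
print(generalized[-1])         # end point approaches the target (0.3, 0.2, 0.4)
```

Obstacle avoidance as described in the abstract would then be obtained by adjusting the parameters of individual Gaussian components before regression; that step is not shown in this sketch.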
Pages: 980-985
Number of pages: 6