Improving Tracking through Human-Robot Sensory Augmentation

Cited by: 15
Authors
Li, Yanan [1 ,2 ]
Eden, Jonathan [1 ]
Carboni, Gerolamo [1 ]
Burdet, Etienne [1 ]
Affiliations
[1] Imperial Coll Sci Technol & Med, Dept Bioengn, London SW7 2AZ, England
[2] Univ Sussex, Dept Engn & Informat, Brighton BN1 9RH, E Sussex, England
Funding
EU Horizon 2020; UK Engineering and Physical Sciences Research Council;
Keywords
Cooperating robots; human-centered robotics; physical human-robot interaction;
DOI
10.1109/LRA.2020.2998715
CLC Number
TP24 [Robotics];
Subject Classification Codes
080202 ; 1405 ;
Abstract
This letter introduces a sensory augmentation technique enabling a contact robot to understand its human user's control in real time and to integrate their reference trajectory information into its own sensory feedback to improve tracking performance. The human's control is formulated as a feedback controller with unknown control gains and an unknown desired trajectory. An unscented Kalman filter is used to estimate first the control gains and then the desired trajectory. The estimate of the human's desired trajectory is used as augmented sensory information about the system and combined with the robot's measurement to estimate a reference trajectory. Simulations and an implementation on a robotic interface demonstrate that the reactive control can robustly identify the human user's control, and that the sensory augmentation improves the robot's tracking performance.
Pages: 4399-4406
Number of pages: 8
Related Papers
50 records in total
  • [21] Performance metrics for improving human-robot interaction
    Gatsoulis, Yiannis
    ADVANCES IN CLIMBING AND WALKING ROBOTS, PROCEEDINGS, 2007, : 716 - 725
  • [22] Improving Human-Robot Interaction by a Multimodal Interface
    Ubeda, Andres
    Ianez, Eduardo
    Azorin, Jose M.
    Sabater, Jose M.
    Garcia, Nicolas M.
    Perez, Carlos
    IEEE INTERNATIONAL CONFERENCE ON SYSTEMS, MAN AND CYBERNETICS (SMC 2010), 2010, : 3580 - 3585
  • [23] Enhancing a human-robot interface using sensory egosphere
    Johnson, C
    Koku, AB
    Kawamura, K
    Peters, RA
    2002 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION, VOLS I-IV, PROCEEDINGS, 2002, : 4132 - 4137
  • [24] Improving Human-Robot Team Transparency with Eye-tracking based Situation Awareness Assessment
    Aderinto, Favour
    Smith, Josh Bhagat
    Giolando, Mark-Robin
    Baskaran, Prakash
    Adams, Julie A.
    COMPANION OF THE 2024 ACM/IEEE INTERNATIONAL CONFERENCE ON HUMAN-ROBOT INTERACTION, HRI 2024 COMPANION, 2024, : 177 - 181
  • [25] Improving Human-Robot Interaction Effectiveness in Human-Robot Collaborative Object Transportation using Force Prediction
    Dominguez-Vidal, J. E.
    Sanfeliu, Alberto
    2023 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2023, : 7839 - 7845
  • [26] Human tracking and silhouette extraction for human-robot interaction systems
    Ahn, Jung-Ho
    Choi, Cheolmin
    Kwak, Sooyeong
    Kim, Kilcheon
    Byun, Hyeran
    PATTERN ANALYSIS AND APPLICATIONS, 2009, 12 (02) : 167 - 177
  • [27] Intuitive human-robot interaction through active 3D gaze tracking
    Atienza, R
    Zelinsky, A
    Robotics Research, 2005, 15 : 172 - 181
  • [28] Flexible Human-robot interaction: collaborative robot integrated with hand tracking
    Ochoa, Oscar
    Mendez, Enrico
    Lucas-Dophe, Carolina
    Luna-Sanchez, Jose Alfredo
    Soto-Herrera, Victor Hugo
    Olivera-Guzman, David
    Alvarado Perez, Miriam
    del-Real, Eloina
    Ayala-Garcia, Ivo N.
    Gonzalez, Alejandro
    2023 XXV ROBOTICS MEXICAN CONGRESS, COMROB, 2023, : 25 - 30
  • [29] A Beat-Tracking Robot for Human-Robot Interaction and Its Evaluation
    Murata, Kazumasa
    Nakadai, Kazuhiro
    Takeda, Ryu
    Okuno, Hiroshi G.
    Torii, Toyotaka
    Hasegawa, Yuji
    Tsujino, Hiroshi
    2008 8TH IEEE-RAS INTERNATIONAL CONFERENCE ON HUMANOID ROBOTS (HUMANOIDS 2008), 2008, : 79 - +
  • [30] Improving Workers' Musculoskeletal Health During Human-Robot Collaboration Through Reinforcement Learning
    Xie, Ziyang
    Lu, Lu
    Wang, Hanwen
    Su, Bingyi
    Liu, Yunan
    Xu, Xu
    HUMAN FACTORS, 2024, 66 (06) : 1754 - 1769