Improving Tracking through Human-Robot Sensory Augmentation

Cited: 15
Authors
Li, Yanan [1 ,2 ]
Eden, Jonathan [1 ]
Carboni, Gerolamo [1 ]
Burdet, Etienne [1 ]
Affiliations
[1] Imperial Coll Sci Technol & Med, Dept Bioengn, London SW7 2AZ, England
[2] Univ Sussex, Dept Engn & Informat, Brighton BN1 9RH, E Sussex, England
Funding
European Union Horizon 2020; UK Engineering and Physical Sciences Research Council (EPSRC)
Keywords
Cooperating robots; human-centered robotics; physical human-robot interaction;
DOI
10.1109/LRA.2020.2998715
Chinese Library Classification (CLC)
TP24 [Robotics]
Discipline codes
080202; 1405
Abstract
This letter introduces a sensory augmentation technique enabling a contact robot to understand its human user's control in real time and integrate their reference trajectory information into its own sensory feedback to improve the tracking performance. The human's control is formulated as a feedback controller with unknown control gains and desired trajectory. An unscented Kalman filter is used to estimate first the control gains and then the desired trajectory. The estimated human's desired trajectory is used as augmented sensory information about the system and combined with the robot's measurement to estimate a reference trajectory. Simulations and an implementation on a robotic interface demonstrate that the reactive control can robustly identify the human user's control, and that the sensory augmentation improves the robot's tracking performance.
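
A minimal sketch (not the authors' implementation) of the estimation idea described in the abstract: an unscented Kalman filter infers a human partner's feedback control gains and desired trajectory from the measured interaction force. It assumes a 1-DOF task, a PD-like human controller u_h = kp*(xd - x) - kd*dx, a random-walk model for the state [kp, kd, xd], and the filterpy package for the UKF; the letter estimates the gains first and the desired trajectory second, whereas this sketch estimates them jointly for brevity. All numerical values are illustrative.

import numpy as np
from filterpy.kalman import MerweScaledSigmaPoints, UnscentedKalmanFilter

dt = 0.01  # sample time [s]

def fx(s, dt):
    # Process model: gains and desired position modelled as a slow random walk.
    return s  # s = [kp, kd, xd]

def hx(s, x=0.0, dx=0.0):
    # Measurement model: human force predicted from the current robot state (x, dx).
    kp, kd, xd = s
    return np.array([kp * (xd - x) - kd * dx])

points = MerweScaledSigmaPoints(n=3, alpha=0.1, beta=2.0, kappa=0.0)
ukf = UnscentedKalmanFilter(dim_x=3, dim_z=1, dt=dt, fx=fx, hx=hx, points=points)
ukf.x = np.array([50.0, 5.0, 0.0])      # initial guess for [kp, kd, xd]
ukf.P = np.diag([1e3, 1e2, 1e-1])       # initial uncertainty
ukf.Q = np.diag([1e-2, 1e-3, 1e-4])     # random-walk process noise
ukf.R = np.array([[0.05]])              # force measurement noise

# Simulated "ground truth": the human tracks a slow sinusoid with fixed, unknown gains.
kp_true, kd_true = 120.0, 12.0
m, b = 2.0, 10.0                        # robot mass and damping (assumed known)
x, dx = 0.0, 0.0
rng = np.random.default_rng(0)

for k in range(2000):
    t = k * dt
    xd_true = 0.1 * np.sin(0.5 * t)
    u_h = kp_true * (xd_true - x) - kd_true * dx   # true human force
    z = u_h + rng.normal(scale=0.2)                # noisy force measurement

    ukf.predict()
    ukf.update(np.array([z]), x=x, dx=dx)          # robot state forwarded to hx

    # Propagate the simulated 1-DOF robot driven by the human force alone.
    ddx = (u_h - b * dx) / m
    dx += ddx * dt
    x += dx * dt

kp_hat, kd_hat, xd_hat = ukf.x
print(f"estimated kp={kp_hat:.1f}, kd={kd_hat:.1f}, xd={xd_hat:.3f}")

Modelling the unknown gains and desired position as a random walk is the standard device for parameter estimation with a Kalman filter; the small process noise on the gains keeps them nearly constant while still letting the filter correct its initial guess.
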
Pages: 4399-4406
Number of pages: 8