Towards Robust Robot Control in Cartesian Space Using an Infrastructureless Head- and Eye-Gaze Interface

Cited by: 14
Authors
Woehle, Lukas [1 ]
Gebhard, Marion [1 ]
Affiliations
[1] Westphalian Univ Appl Sci, Dept Elect Engn & Appl Sci, Grp Sensors & Actuators, D-45877 Gelsenkirchen, Germany
Keywords
data fusion; MARG sensors; hands-free interface; pose estimation; human-robot collaboration; robot control in Cartesian space; multisensory interface; gaze control
DOI
10.3390/s21051798
Chinese Library Classification: O65 [Analytical Chemistry]
Subject Classification Codes: 070302; 081704
Abstract
This paper presents a lightweight, infrastructureless head-worn interface for robust, real-time robot control in Cartesian space using head- and eye-gaze. The interface weighs just 162 g in total. It combines a state-of-the-art visual simultaneous localization and mapping algorithm for RGB-D cameras (ORB-SLAM2) with a magnetic, angular rate, and gravity (MARG) sensor filter. The data fusion process is designed to dynamically switch between magnetic, inertial, and visual heading sources to enable robust orientation estimation under various disturbances, e.g., magnetic disturbances or degraded visual sensor data. The interface furthermore delivers accurate eye- and head-gaze vectors to enable precise robot end effector (EFF) positioning and employs a head motion mapping technique to effectively control the robot's end effector orientation. An experimental proof of concept demonstrates that the proposed interface and its data fusion process generate reliable and robust pose estimates. The three-dimensional head- and eye-gaze position estimation pipeline delivers a mean Euclidean error of 19.0 ± 15.7 mm for head-gaze and 27.4 ± 21.8 mm for eye-gaze at distances of 0.3-1.1 m from the user. This indicates that the proposed interface offers a precise control mechanism for hands-free, full six-degree-of-freedom (DoF) robot teleoperation in Cartesian space by head- or eye-gaze and head motion.
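
The disturbance-aware switching between heading sources is the core of the fusion process described above. The sketch below illustrates that idea for a single yaw angle only; it is not taken from the paper, and every name, threshold, and gain in it (HeadingSources, EARTH_FIELD_UT, MAG_TOLERANCE_UT, GAIN, select_heading, fuse_heading) is a hypothetical placeholder. The published filter estimates full 3-D orientation from MARG and visual data rather than a single heading angle.

# A minimal sketch, assuming a yaw-only complementary correction; all names,
# thresholds and gains are illustrative assumptions, not the authors' MARG filter.
import math
from dataclasses import dataclass

@dataclass
class HeadingSources:
    yaw_gyro: float        # heading propagated by gyroscope integration (rad), drifts over time
    yaw_mag: float         # heading from magnetometer + gravity (rad), disturbance-prone
    yaw_visual: float      # heading from the visual SLAM camera pose (rad), can drop out
    mag_norm: float        # measured magnetic field magnitude (uT)
    visual_tracked: bool   # True while the visual tracker reports a valid pose

EARTH_FIELD_UT = 50.0      # expected local geomagnetic field magnitude (assumed value)
MAG_TOLERANCE_UT = 5.0     # deviation treated as a magnetic disturbance (assumed value)
GAIN = 0.02                # correction gain per filter update (assumed value)

def select_heading(src: HeadingSources) -> float:
    """Pick the most trustworthy absolute heading reference for this update."""
    if src.visual_tracked:
        return src.yaw_visual                               # prefer the visual heading
    if abs(src.mag_norm - EARTH_FIELD_UT) < MAG_TOLERANCE_UT:
        return src.yaw_mag                                  # magnetometer only if undisturbed
    return src.yaw_gyro                                     # otherwise coast on gyro integration

def fuse_heading(yaw_estimate: float, src: HeadingSources) -> float:
    """Pull the gyro-propagated heading estimate towards the selected reference."""
    reference = select_heading(src)
    # Wrap the error to (-pi, pi] so the correction never takes the long way around.
    error = math.atan2(math.sin(reference - yaw_estimate), math.cos(reference - yaw_estimate))
    return yaw_estimate + GAIN * error

# Example update: visual tracking is lost and the magnetic field is disturbed,
# so the sketch falls back to the gyro heading and applies no absolute correction.
yaw = fuse_heading(0.10, HeadingSources(yaw_gyro=0.10, yaw_mag=0.60, yaw_visual=0.12,
                                        mag_norm=72.0, visual_tracked=False))

The atan2-based wrap keeps the correction step well defined across the ±180° boundary, which matters because heading references can disagree by large angles immediately after a source switch.
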
Pages: 1-28 (28 pages)