Control Interface for Hands-free Navigation of Standing Mobility Vehicles based on Upper-Body Natural Movements

Cited by: 8
Authors
Chen, Yang [1 ]
Paez-Granados, Diego [2 ]
Kadone, Hideki [3 ]
Suzuki, Kenji [4 ,5 ]
Affiliations
[1] Univ Tsukuba, Sch Integrat & Global Majors SIGMA, Tsukuba, Ibaraki, Japan
[2] Ecole Polytech Fed Lausanne EPFL, Learning Algorithms & Syst Lab LASA, Lausanne, Switzerland
[3] Univ Tsukuba Hosp, Ctr Innovat Med & Engn, Tsukuba, Ibaraki, Japan
[4] Univ Tsukuba, Fac Engn, Tsukuba, Ibaraki, Japan
[5] Univ Tsukuba, Ctr Cybern Res, Tsukuba, Ibaraki, Japan
Keywords
Medical robotics; human-robot interaction; human-machine interface design; human-in-the-loop control; wheelchairs
DOI
10.1109/IROS45743.2020.9340875
CLC Classification Number
TP [Automation Technology, Computer Technology]
Discipline Classification Code
0812
Abstract
In this paper, we propose and evaluate a novel human-machine interface (HMI) for controlling a standing mobility vehicle or person-carrier robot, aiming at hands-free control through natural upper-body postures derived from gaze tracking while walking. We target users with lower-body impairment who retain upper-body motion capabilities. The developed HMI is based on a sensing array for capturing body postures, an intent-recognition algorithm for continuously mapping body motions to the robot control space, and a personalization system accommodating multiple body sizes and shapes. We performed two user studies: first, an analysis of the body muscles involved in navigating with the proposed control; and second, an assessment of the HMI against a standard joystick using quantitative and qualitative metrics in a narrow-circuit task. We conclude that the main user control contribution comes from the rectus abdominis and erector spinae muscle groups at different levels. Finally, the comparative study showed that a joystick still outperforms the proposed HMI in usability perceptions and controllability metrics; however, the smoothness of user control was similar in terms of jerk and fluency. Moreover, users' perceptions indicated that hands-free control made the vehicle feel more anthropomorphic, animated, and even safer.
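The abstract describes a continuous mapping from upper-body motion to the robot's control space, with a personalization step for different body sizes. A minimal sketch of what such a mapping could look like is given below; this is not the authors' implementation — the function name, gains, dead zone, and the use of trunk pitch/roll relative to a per-user neutral posture are all illustrative assumptions.

```python
def posture_to_velocity(pitch, roll, pitch_neutral=0.0, roll_neutral=0.0,
                        v_max=1.0, w_max=1.0, dead_zone=0.05):
    """Hypothetical posture-to-velocity mapping (illustrative only).

    Maps trunk pitch/roll in radians, measured relative to a per-user
    neutral posture (standing in for the paper's personalization step),
    to a linear velocity v [m/s] and angular velocity w [rad/s].
    """
    dp = pitch - pitch_neutral   # forward lean -> forward motion
    dr = roll - roll_neutral     # sideways lean -> turning

    def scaled(x, limit):
        if abs(x) < dead_zone:   # ignore small postural sway
            return 0.0
        # linear gain outside the dead zone, saturated at the limit
        s = (abs(x) - dead_zone) / (0.5 - dead_zone)
        return max(-limit, min(limit, s * limit)) * (1.0 if x > 0 else -1.0)

    return scaled(dp, v_max), scaled(dr, w_max)
```

A dead zone of this kind is a common way to keep natural postural sway from producing spurious motion commands, while the saturation bounds the command sent to the vehicle's low-level controller.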
Pages: 11322 - 11329
Page count: 8
Related Papers
(4 items)
  • [1] A Novel Head Gesture Based Interface for Hands-free Control of a Robot
    Jackowski, Anja
    Gebhard, Marion
    Graeser, Axel
    2016 IEEE INTERNATIONAL SYMPOSIUM ON MEDICAL MEASUREMENTS AND APPLICATIONS (MEMEA), 2016, : 262 - 267
  • [2] Development and Evaluation of a Hands-Free Motion Cueing Interface for Ground-Based Navigation
    Freiberg, Jacob
    Kitson, Alexandra
    Riecke, Bernhard E.
    2017 IEEE VIRTUAL REALITY (VR), 2017, : 273 - 274
  • [3] Hands-Free EEG-Based Control of a Computer Interface Based on Online Detection of Clenching of Jaw
    Khoshnam, Mahta
    Kuatsjah, Eunice
    Zhang, Xin
    Menon, Carlo
    BIOINFORMATICS AND BIOMEDICAL ENGINEERING, IWBBIO 2017, PT I, 2017, 10208 : 497 - 507
  • [4] Sublime: a Hands-Free Virtual Reality Menu Navigation System Using a High-Frequency SSVEP-based Brain-Computer Interface
    Armengol-Urpi, Alexandre
    Sarma, Sanjay E.
    24TH ACM SYMPOSIUM ON VIRTUAL REALITY SOFTWARE AND TECHNOLOGY (VRST 2018), 2018,