A Wearable Robotic Device for Assistive Navigation and Object Manipulation

Cited by: 2
Authors
Jin, Lingqiu [1 ]
Zhang, He [1 ]
Ye, Cang [1 ]
Institutions
[1] Virginia Commonwealth Univ, Comp Sci Dept, Richmond, VA 23284 USA
Keywords
ROBUST;
DOI
10.1109/IROS51168.2021.9636126
Chinese Library Classification
TP [Automation Technology; Computer Technology];
Subject Classification Code
0812 ;
Abstract
This paper presents a hand-worn assistive device that helps a visually impaired person with object manipulation. The device uses a Google Pixel 3 as the computational platform, a Structure Core (SC) sensor for perception, and speech and haptic interfaces for human-device interaction. The device, called W-ROMA, is intended to help a visually impaired user locate a target object (nearby or afar), guide the user towards it, and eventually take hold of the object. To achieve this objective, three functions are developed: object detection, wayfinding, and motion guidance. Object detection locates the target object's position if it falls within the camera's field of view. Wayfinding enables the user to approach the object. The haptic/speech interface guides the user to move close to the object and then guides the hand to reach it. A new visual-inertial odometry (VIO) method, called RGBD-VIO, is devised to accurately estimate the device's pose (position and orientation), which is then used to generate the motion commands that guide the user and his/her hand to the object. Experimental results demonstrate that RGBD-VIO outperforms state-of-the-art VIO methods in 6-DOF device pose estimation and that the device is effective in assistive object manipulation.
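The abstract describes a pipeline in which RGBD-VIO's 6-DOF pose estimate is converted into motion commands for the haptic/speech interface. As a rough illustration only (not the paper's actual algorithm), the sketch below shows one way a world-frame pose estimate and a detected target position could be reduced to a coarse directional cue; the function name, camera-frame axis convention, and the reach threshold are all assumptions introduced here for illustration.

```python
import numpy as np

def guidance_cue(device_pos, device_R, object_pos, reach_dist=0.15):
    """Illustrative sketch: turn a 6-DOF pose estimate (position +
    rotation matrix, world frame) and a target position into a coarse
    cue such as a hand-worn device's haptic/speech interface might issue.
    Axis convention assumed: x right, y down, z forward (camera frame)."""
    # Express the target in the device frame: p_dev = R^T (p_obj - p_cam)
    rel = device_R.T @ (np.asarray(object_pos, float) - np.asarray(device_pos, float))
    if np.linalg.norm(rel) <= reach_dist:
        return "reach"  # object is within grasping distance of the hand
    x, y, z = rel
    # Cue the dominant off-axis direction; otherwise move forward.
    if abs(x) > abs(z):
        return "right" if x > 0 else "left"
    if abs(y) > abs(z):
        return "down" if y > 0 else "up"
    return "forward"
```

For example, with the device at the origin, an identity orientation, and the object one metre ahead, the cue is "forward"; once the object is within `reach_dist` of the device, the cue switches to "reach".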
Pages: 765-770 (6 pages)
Related Papers
(50 records in total)
  • [1] A Wearable Robotic Object Manipulation Aid for the Visually Impaired
    Liu, Xiaoping
    Zhang, He
    Jin, Lingqiu
    Ye, Cang
    [J]. 2018 IEEE 1ST INTERNATIONAL CONFERENCE ON MICRO/NANO SENSORS FOR AI, HEALTHCARE, AND ROBOTICS (NSENS), 2018, : 5 - 9
  • [2] Development of a Wearable Assistive Soft Robotic Device for Elbow Rehabilitation
    Oguntosin, Victoria
    Harwin, William S.
    Kawamura, Sadao
    Nasuto, Slawomir J.
    Hayashi, Yoshikatsu
    [J]. PROCEEDINGS OF THE IEEE/RAS-EMBS INTERNATIONAL CONFERENCE ON REHABILITATION ROBOTICS (ICORR 2015), 2015, : 747 - 752
  • [3] Object Manipulation for Assistive Robots
    Dragoi, Marius
    Mocanu, Irina
    Cramariuc, Oana
    [J]. 2021 INTERNATIONAL CONFERENCE ON E-HEALTH AND BIOENGINEERING (EHB 2021), 9TH EDITION, 2021,
  • [4] Grasp Detection for Assistive Robotic Manipulation
    Jain, Siddarth
    Argall, Brenna
    [J]. 2016 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA), 2016, : 2015 - 2021
  • [5] ROBOTIC ASSISTIVE DEVICE FOR PHLEBOTOMY
    Carvalho, Paulo
    Kesari, Anurag
    Weaver, Sean
    Flaherty, Patrick
    Fischer, Gregory S.
    [J]. INTERNATIONAL DESIGN ENGINEERING TECHNICAL CONFERENCES AND COMPUTERS AND INFORMATION IN ENGINEERING CONFERENCE, 2015, VOL 3, 2016,
  • [6] SOFT WEARABLE DELTOID ASSISTIVE DEVICE
    Arellano, Francisco Javier Lopez
    Gandhi, Sushrut
    Patil, Dhiraj
    Roquemore, Bryan
    Maruyama, Trent
    Polygerinos, Panagiotis
    [J]. 2019 DESIGN OF MEDICAL DEVICES CONFERENCE, 2019,
  • [7] Human-Robot Interaction for Assisted Object Grasping by a Wearable Robotic Object Manipulation Aid for the Blind
    Jin, Lingqiu
    Zhang, He
    Shen, Yantao
    Ye, Cang
    [J]. PROCEEDINGS OF THE 2020 IEEE INTERNATIONAL CONFERENCE ON HUMAN-MACHINE SYSTEMS (ICHMS), 2020, : 548 - 553
  • [8] Perception of cloth in assistive robotic manipulation tasks
    Jimenez, Pablo
    Torras, Carme
    [J]. NATURAL COMPUTING, 2020, 19 (02) : 409 - 431
  • [10] Vision-Based Solutions for Robotic Manipulation and Navigation Applied to Object Picking and Distribution
    Máximo A. Roa-Garzón
    Elena F. Gambaro
    Monika Florek-Jasinska
    Felix Endres
    Felix Ruess
    Raphael Schaller
    Christian Emmerich
    Korbinian Muenster
    Michael Suppa
    [J]. KI - Künstliche Intelligenz, 2019, 33 : 171 - 180