Intention estimation from gaze and motion features for human-robot shared-control object manipulation

Cited by: 2
Authors
Belardinelli, Anna [1 ]
Kondapally, Anirudh Reddy [2 ]
Ruiken, Dirk [1 ]
Tanneberg, Daniel [1 ]
Watabe, Tomoki [2 ]
Affiliations
[1] Honda Res Inst EU, Offenbach, Germany
[2] Honda Res & Dev Co Ltd, Innovat Res Excellence, Wako, Saitama, Japan
Keywords
EYE-HAND COORDINATION; AUTONOMY CONTROL; RECOGNITION; SYSTEMS;
DOI
10.1109/IROS47612.2022.9982249
CLC classification
TP [automation technology; computer technology]
Discipline code
0812
Abstract
Shared control can help in teleoperated object manipulation by assisting with the execution of the user's intention. To this end, robust and prompt intention estimation is needed, which relies on behavioral observations. Here, an intention estimation framework is presented that uses natural gaze and motion features to predict the current action and the target object. The system is trained and tested in a simulated environment with pick-and-place sequences produced in a relatively cluttered scene, using both hands with possible hand-over to the other hand. Validation is conducted across different users and hands, achieving good accuracy and earliness of prediction. An analysis of the predictive power of single features shows the predominance of the grasping trigger and the gaze features in the early identification of the current action. In the current framework, the same probabilistic model can be used for the two hands working in parallel and independently, while a rule-based model is proposed to identify the resulting bimanual action. Finally, limitations and perspectives of this approach to more complex, fully bimanual manipulations are discussed.
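The abstract describes a probabilistic model that predicts the target object from gaze features. A minimal sketch of how such a prediction can work is a Bayesian belief update over candidate objects driven by gaze samples; everything below (function names, the Gaussian gaze-noise model, `sigma`, the object layout) is an illustrative assumption, not the paper's actual model.

```python
# Hypothetical sketch: Bayesian update of the target-object belief from
# noisy gaze samples, assuming gaze tends to land near the intended object.
import math

def update_belief(belief, gaze_xy, objects, sigma=0.05):
    """One Bayesian step: P(target | gaze) is proportional to
    P(gaze | target) * P(target).

    belief  -- dict mapping object name -> prior probability
    gaze_xy -- (x, y) gaze point in the scene
    objects -- dict mapping object name -> (x, y) position
    sigma   -- assumed std. dev. of gaze noise (illustrative value)
    """
    posterior = {}
    for name, prior in belief.items():
        ox, oy = objects[name]
        d2 = (gaze_xy[0] - ox) ** 2 + (gaze_xy[1] - oy) ** 2
        likelihood = math.exp(-d2 / (2 * sigma ** 2))  # Gaussian gaze model
        posterior[name] = prior * likelihood
    z = sum(posterior.values()) or 1.0  # normalize; guard against underflow
    return {k: v / z for k, v in posterior.items()}

# Toy scene: two candidate objects, uniform prior, three fixations near "cup".
objects = {"cup": (0.2, 0.1), "box": (0.8, 0.4)}
belief = {"cup": 0.5, "box": 0.5}
for gaze in [(0.22, 0.12), (0.19, 0.09), (0.21, 0.11)]:
    belief = update_belief(belief, gaze, objects)
# The belief concentrates on "cup" as fixations accumulate near it.
```

Because each step is an independent per-hand update, the same model can run for both hands in parallel, matching the abstract's description of two independent probabilistic models combined by a rule-based bimanual layer.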
Pages: 9806-9813
Page count: 8