Intention estimation from gaze and motion features for human-robot shared-control object manipulation

Cited by: 2
Authors
Belardinelli, Anna [1 ]
Kondapally, Anirudh Reddy [2 ]
Ruiken, Dirk [1 ]
Tanneberg, Daniel [1 ]
Watabe, Tomoki [2 ]
Affiliations
[1] Honda Res Inst EU, Offenbach, Germany
[2] Honda Res & Dev Co Ltd, Innovat Res Excellence, Wako, Saitama, Japan
Keywords
EYE-HAND COORDINATION; AUTONOMY CONTROL; RECOGNITION; SYSTEMS
DOI
10.1109/IROS47612.2022.9982249
CLC number
TP [Automation Technology, Computer Technology]
Discipline code
0812
Abstract
Shared control can assist teleoperated object manipulation by supporting the execution of the user's intention. To this end, robust and prompt intention estimation is needed, relying on behavioral observations. Here, an intention estimation framework is presented, which uses natural gaze and motion features to predict the current action and the target object. The system is trained and tested in a simulated environment with pick-and-place sequences performed in a relatively cluttered scene and with both hands, including possible hand-overs to the other hand. Validation across different users and hands shows good accuracy and earliness of prediction. An analysis of the predictive power of single features shows the predominance of the grasping trigger and the gaze features in the early identification of the current action. In the current framework, the same probabilistic model can be used for the two hands working in parallel and independently, while a rule-based model is proposed to identify the resulting bimanual action. Finally, limitations and perspectives of extending this approach to more complex, fully bimanual manipulations are discussed.
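The abstract describes a probabilistic model that fuses gaze and motion features, with the grasping trigger as a strong early cue, to predict the current action and target object. As a rough illustration of how such a model might work, here is a minimal recursive Bayesian sketch; the feature set, object names, and all likelihood values are illustrative assumptions and are not taken from the paper.

```python
# Hypothetical sketch of a recursive Bayesian estimator over candidate
# (action, target) intentions, updated from discretized gaze, motion,
# and grasping-trigger features. All likelihood values and object names
# are illustrative assumptions, not taken from the paper.

ACTIONS = ["pick", "place"]
OBJECTS = ["cup", "box", "ball"]
INTENTIONS = [(a, o) for a in ACTIONS for o in OBJECTS]

# Assumed observation likelihoods p(feature | intention).
P_GAZE_ON_TARGET = 0.7                       # gaze fixates the intended object
P_GAZE_ELSEWHERE = 0.3 / (len(OBJECTS) - 1)  # gaze on any other object
P_MOVE_TOWARD = 0.8                          # hand moves toward the intended object
P_MOVE_AWAY = 0.2
P_EMPTY_GIVEN_PICK = 0.9                     # hand empty while intending to pick
P_EMPTY_GIVEN_PLACE = 0.1

def update(belief, gazed_object, moving_toward, hand_empty):
    """One Bayesian update of the belief over (action, target) intentions."""
    posterior = {}
    for (action, target), prior in belief.items():
        lik_gaze = P_GAZE_ON_TARGET if gazed_object == target else P_GAZE_ELSEWHERE
        lik_move = P_MOVE_TOWARD if moving_toward == target else P_MOVE_AWAY
        p_empty = P_EMPTY_GIVEN_PICK if action == "pick" else P_EMPTY_GIVEN_PLACE
        lik_grasp = p_empty if hand_empty else 1.0 - p_empty
        posterior[(action, target)] = prior * lik_gaze * lik_move * lik_grasp
    total = sum(posterior.values())
    return {k: v / total for k, v in posterior.items()}

# Start from a uniform prior and feed three frames of consistent evidence:
# gaze and hand motion directed at "cup" with an empty hand.
belief = {i: 1.0 / len(INTENTIONS) for i in INTENTIONS}
for _ in range(3):
    belief = update(belief, gazed_object="cup", moving_toward="cup", hand_empty=True)
best = max(belief, key=belief.get)  # most likely (action, target) intention
```

The grasping-trigger feature is what disambiguates pick from place here, mirroring the abstract's observation that it dominates early action identification; the same per-hand model could run twice for parallel, independent hands.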
Pages: 9806-9813
Number of pages: 8
Related papers
50 records in total
  • [1] Semantic Gaze Labeling for Human-Robot Shared Manipulation
    Aronson, Reuben M.
    Admoni, Henny
    [J]. ETRA 2019: 2019 ACM SYMPOSIUM ON EYE TRACKING RESEARCH & APPLICATIONS, 2019,
  • [2] Neural Control for Human-Robot Interaction with Human Motion Intention Estimation
    Peng, Guangzhu
    Yang, Chenguang
    Chen, C. L. Philip
    [J]. IEEE TRANSACTIONS ON INDUSTRIAL ELECTRONICS, 2024, 71 (12) : 1 - 10
  • [3] Human-robot cooperative manipulation with motion estimation
    Maeda, Y
    Hara, T
    Arai, T
    [J]. IROS 2001: PROCEEDINGS OF THE 2001 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS, VOLS 1-4: EXPANDING THE SOCIETAL ROLE OF ROBOTICS IN THE NEXT MILLENNIUM, 2001, : 2240 - 2245
  • [4] Research on Human-robot Shared Control of Throat Swab Sampling Robot Based on Intention Estimation
    Chen, Ying-Long
    Song, Fu-Jun
    Yan, Heng-Fei
    Zhao, Peng-Yu
    Gong, Yong-Jun
    [J]. INTERNATIONAL JOURNAL OF CONTROL AUTOMATION AND SYSTEMS, 2024, 22 (02) : 661 - 675
  • [5] Human-Robot Collaboration Based on Motion Intention Estimation
    Li, Yanan
    Ge, Shuzhi Sam
    [J]. IEEE-ASME TRANSACTIONS ON MECHATRONICS, 2014, 19 (03) : 1007 - 1014
  • [6] Shared Control for Human-Robot Cooperative Manipulation Tasks
    Petric, Tadej
    Cevzar, Misel
    Babic, Jan
    [J]. ADVANCES IN SERVICE AND INDUSTRIAL ROBOTICS, 2018, 49 : 787 - 796
  • [7] Using Human Motion Estimation for Human-Robot Cooperative Manipulation
    Thobbi, Anand
    Gu, Ye
    Sheng, Weihua
    [J]. 2011 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS, 2011, : 2873 - 2878
  • [8] Human grasp position estimation for human-robot cooperative object manipulation
    Ansari, Ramin Jaberzadeh
    Giordano, Giuseppe
    Sjoberg, Jonas
    Karayiannidis, Yiannis
    [J]. ROBOTICS AND AUTONOMOUS SYSTEMS, 2020, 131
  • [9] Estimation of human impedance and motion intention for constrained human-robot interaction
    Yu, Xinbo
    Li, Yanan
    Zhang, Shuang
    Xue, Chengqian
    Wang, Yu
    [J]. NEUROCOMPUTING, 2020, 390 : 268 - 279