Gesteme-free context-aware adaptation of robot behavior in human-robot cooperation

Cited by: 4
Authors
Nessi, Federico [1 ]
Beretta, Elisa [1 ,2 ]
Gatti, Cecilia [1 ]
Ferrigno, Giancarlo [1 ]
De Momi, Elena [1 ]
Affiliations
[1] Politecn Milan, Dept Elect Informat & Bioengn, Pzza Leonardo da Vinci 32, I-20133 Milan, Italy
[2] Kuka Roboter GmbH, Zugspitzstr 140, D-86165 Augsburg, Germany
Keywords
Activity recognition; Cooperative robotics; Context-awareness
DOI
10.1016/j.artmed.2016.10.001
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Background: Cooperative robotics is gaining acceptance because the typical advantages of manipulators are combined with intuitive usage. In particular, hands-on robotics may benefit from adapting the assistant's behavior to the activity currently performed by the user. This requires fast and reliable classification of human activities, as well as strategies to smoothly modify the control of the manipulator. In this scenario, gesteme-based motion classification is inadequate because it requires observing a large portion of the signal and defining a rich vocabulary.
Objective: This work presents a system that recognizes the user's current activity without a vocabulary of gestemes and adapts the manipulator's dynamic behavior accordingly.
Methods and material: An underlying stochastic model fits variations in the user's guidance forces and the resulting trajectories of the manipulator's end-effector with a set of Gaussian distributions. The high-level switching between these distributions is captured with hidden Markov models. The dynamics of the KUKA light-weight robot, a torque-controlled manipulator, are modified with respect to the classified activity using sigmoid-shaped functions. The system is validated on a pool of 12 naive users in a scenario that addresses surgical targeting tasks on soft tissue. The robot's assistance is adapted to obtain a stiff behavior during activities that require critical accuracy constraints and higher compliance during wide movements. Both the ability to provide the correct classification at each moment (sample accuracy) and the capability to identify the correct sequence of activities (sequence accuracy) were evaluated.
Results: The proposed classifier is fast and accurate in all the experiments conducted (80% sample accuracy after observing ~450 ms of signal). Moreover, the correct sequence of activities is recognized without unwanted transitions (sequence accuracy ~90% when computed away from user-desired transitions). Finally, the proposed activity-based adaptation of the robot's dynamics preserves smooth behavior (high smoothness, i.e. normalized jerk score < 0.01).
Conclusion: The proposed system is able to dynamically assist the operator during cooperation in the presented scenario. (C) 2016 Elsevier B.V. All rights reserved.
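
To make the two ideas in the abstract concrete, the following Python snippet is a minimal sketch, not the authors' implementation: (i) an online activity filter built from a hidden Markov model whose emissions are Gaussian distributions over guidance-force and end-effector features, and (ii) a sigmoid-shaped blending of the robot's Cartesian stiffness between a compliant and a stiff setting. The class name, the choice of features, and all numerical parameters below are illustrative assumptions, not values taken from the paper.

# Sketch of HMM-based activity filtering plus sigmoid stiffness adaptation.
# Everything here (features, transition matrix, stiffness range) is assumed.
import numpy as np

class GaussianHMMFilter:
    """Forward-algorithm filter: P(activity_t | observations up to time t)."""
    def __init__(self, pi, A, means, covs):
        self.pi = np.asarray(pi)        # initial activity probabilities
        self.A = np.asarray(A)          # activity transition matrix
        self.means = np.asarray(means)  # per-activity Gaussian means
        self.covs = np.asarray(covs)    # per-activity Gaussian covariances
        self.belief = self.pi.copy()

    def _likelihood(self, x):
        # Gaussian density of the observation under each activity model.
        like = np.empty(len(self.means))
        for k, (m, c) in enumerate(zip(self.means, self.covs)):
            d = x - m
            norm = np.sqrt((2 * np.pi) ** len(m) * np.linalg.det(c))
            like[k] = np.exp(-0.5 * d @ np.linalg.inv(c) @ d) / norm
        return like

    def update(self, x):
        """One predict-and-update step; returns the posterior over activities."""
        self.belief = self._likelihood(x) * (self.A.T @ self.belief)
        self.belief /= self.belief.sum()
        return self.belief

def sigmoid_stiffness(p_fine, k_compliant=200.0, k_stiff=2000.0, gain=12.0):
    """Blend Cartesian stiffness [N/m] with a sigmoid of the probability
    that the user is currently performing a fine-accuracy activity."""
    s = 1.0 / (1.0 + np.exp(-gain * (p_fine - 0.5)))
    return k_compliant + s * (k_stiff - k_compliant)

# Toy usage with two activities: 0 = wide transport motion, 1 = fine targeting.
rng = np.random.default_rng(0)
hmm = GaussianHMMFilter(
    pi=[0.5, 0.5],
    A=[[0.98, 0.02], [0.02, 0.98]],        # sticky transitions
    means=[[0.25, 8.0], [0.03, 2.0]],      # [end-effector speed (m/s), guidance force (N)]
    covs=[np.diag([0.01, 4.0]), np.diag([0.001, 1.0])],
)
for t in range(5):
    obs = np.array([0.03, 2.0]) + rng.normal(0, [0.01, 0.5])  # simulated fine-targeting sample
    belief = hmm.update(obs)
    print(f"t={t}  P(fine)={belief[1]:.2f}  stiffness={sigmoid_stiffness(belief[1]):.0f} N/m")

The sigmoid keeps the stiffness change continuous in the classifier output, which is one plausible way to obtain the smooth (low normalized jerk) adaptation reported in the abstract.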
Pages: 32-43
Number of pages: 12
Related Papers
50 records in total
  • [1] Zhang, Dandan; Wu, Zicong; Chen, Junhong; Zhu, Ruiqi; Munawar, Adnan; Xiao, Bo; Guan, Yuan; Su, Hang; Hong, Wuzhou; Guo, Yao; Fischer, Gregory S.; Lo, Benny; Yang, Guang-Zhong. Human-Robot Shared Control for Surgical Robot Based on Context-Aware Sim-to-Real Adaptation. 2022 IEEE International Conference on Robotics and Automation (ICRA 2022), 2022: 7694-7700.
  • [2] Liu, Hongyi; Wang, Yuquan; Ji, Wei; Wang, Lihui. A Context-Aware Safety System for Human-Robot Collaboration. 28th International Conference on Flexible Automation and Intelligent Manufacturing (FAIM 2018): Global Integration of Intelligent Manufacturing and Smart Industry for Good of Humanity, 2018, 17: 238-245.
  • [3] Wang, Xin; Veeramani, Dharmaraj; Dai, Fei; Zhu, Zhenhua. Context-aware hand gesture interaction for human-robot collaboration in construction. Computer-Aided Civil and Infrastructure Engineering, 2024.
  • [4] Quintas, Joao; Martins, Goncalo S.; Santos, Luis; Menezes, Paulo; Dias, Jorge. Toward a Context-Aware Human-Robot Interaction Framework Based on Cognitive Development. IEEE Transactions on Systems, Man, and Cybernetics: Systems, 2019, 49(1): 227-237.
  • [5] Nikolakis, Nikolaos; Sipsas, Konstantinos; Makris, Sotiris. A cyber-physical context-aware system for coordinating human-robot collaboration. 51st CIRP Conference on Manufacturing Systems, 2018, 72: 27-32.
  • [6] Galle, Matthias; Kynev, Ekaterina; Monet, Nicolas; Legras, Christophe. Context-aware selection of multi-modal conversational fillers in human-robot dialogues. 2017 26th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), 2017: 317-322.
  • [7] Zhou, Zhijun; Li, Ruifang; Xu, Wenjun; Yao, Bitao; Ji, Zhenrui. Context-aware assistance guidance via augmented reality for industrial human-robot collaboration. 2022 IEEE 17th Conference on Industrial Electronics and Applications (ICIEA), 2022: 1516-1521.
  • [8] De Magistris, Giorgio; Caprari, Riccardo; Castro, Giulia; Russo, Samuele; Iocchi, Luca; Nardi, Daniele; Napoli, Christian. Vision-Based Holistic Scene Understanding for Context-Aware Human-Robot Interaction. AIxIA 2021 - Advances in Artificial Intelligence, 2022, 13196: 310-325.
  • [9] Miyake, Y.; Miyagawa, T.; Tamura, Y. Internal observation and mutual adaptation in human-robot cooperation. 1998 IEEE International Conference on Systems, Man, and Cybernetics, Vols 1-5, 1998: 3685-3690.
  • [10] Goetz, J.; Kiesler, S.; Powers, A. Matching robot appearance and behavior to tasks to improve human-robot cooperation. RO-MAN 2003: 12th IEEE International Workshop on Robot and Human Interactive Communication, Proceedings, 2003: 55-60.