Initiative in Robot Assistance during Collaborative Task Execution

Cited by: 0
Authors
Baraglia, Jimmy [1]
Cakmak, Maya [2]
Nagai, Yukie [1]
Rao, Rajesh [2]
Asada, Minoru [1]
Affiliations
[1] Osaka Univ, Dept Adapt Machine Syst, Osaka, Japan
[2] Univ Washington, Dept Comp Sci & Engn, Seattle, WA 98195 USA
Funding
National Science Foundation (NSF), USA;
DOI
Not available
Chinese Library Classification (CLC)
TP3 [Computing Technology, Computer Technology];
Subject Classification Code
0812;
Abstract
Collaborative robots are quickly gaining momentum in real-world settings. This has motivated many new research questions in human-robot collaboration. In this paper, we address the questions of whether and when a robot should take initiative during joint human-robot task execution. We develop a system capable of autonomously tracking and performing table-top object manipulation tasks with humans, and we implement three different initiative models to trigger robot actions. Human-initiated help gives control of robot action timing to the user; robot-initiated reactive help triggers robot assistance when it detects that the user needs help; and robot-initiated proactive help makes the robot help whenever it can. We performed a user study (N=18) to compare these trigger mechanisms in terms of task performance, usage characteristics, and subjective preference. We found that people collaborate best with a proactive robot, yielding better team fluency and high subjective ratings. However, they prefer having control of when the robot should help, rather than working with a reactive robot that only helps when it is needed.
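
The abstract names three trigger mechanisms for robot assistance (human-initiated, robot-initiated reactive, robot-initiated proactive). The short Python sketch below is an illustration of how such trigger policies could be expressed in code; it is not taken from the paper, and all names (InitiativeMode, should_robot_act) and the boolean input signals are hypothetical stand-ins for whatever perception and interaction cues a real system would compute.

from enum import Enum, auto


class InitiativeMode(Enum):
    """Three initiative models described in the abstract (names are illustrative)."""
    HUMAN_INITIATED = auto()   # user explicitly requests robot help
    REACTIVE = auto()          # robot helps when it detects that the user needs help
    PROACTIVE = auto()         # robot helps whenever a helpful action is available


def should_robot_act(mode: InitiativeMode,
                     user_requested_help: bool,
                     user_needs_help: bool,
                     robot_can_help: bool) -> bool:
    """Decide whether the robot should trigger an assistive action this cycle.

    The three boolean inputs stand in for real signals such as a button press,
    a detected stall in the user's progress, or an available manipulation action.
    """
    if not robot_can_help:
        return False
    if mode is InitiativeMode.HUMAN_INITIATED:
        return user_requested_help
    if mode is InitiativeMode.REACTIVE:
        return user_needs_help
    # PROACTIVE: act whenever an assistive action is possible
    return True


if __name__ == "__main__":
    # A proactive robot acts as soon as it can help, regardless of whether
    # the user asked for or appears to need assistance.
    print(should_robot_act(InitiativeMode.PROACTIVE,
                           user_requested_help=False,
                           user_needs_help=False,
                           robot_can_help=True))  # -> True

The design choice this sketch highlights is that the three models differ only in which signal gates the same assistive action: an explicit user request, an inferred need for help, or mere availability of a helpful action.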
Pages: 67 - 74
Number of pages: 8