Real-Time Human-Robot Communication for Manipulation Tasks in Partially Observed Environments

Cited by: 2
Authors
Arkin, Jacob [1 ]
Paul, Rohan [2 ]
Park, Daehyung [2 ]
Roy, Subhro [2 ]
Roy, Nicholas [2 ]
Howard, Thomas M. [1 ]
Affiliations
[1] Univ Rochester, Rochester, NY 14627 USA
[2] MIT, Cambridge, MA 02139 USA
Keywords
LANGUAGE;
DOI
10.1007/978-3-030-33950-0_39
Chinese Library Classification (CLC)
TP24 [Robotics]
Subject Classification Codes
080202; 1405
Abstract
In human teams, visual and auditory cues are often used to communicate information about the task and/or environment that may not otherwise be directly observable. Analogously, robots that rely primarily on visual sensors cannot directly observe some attributes of objects that may be necessary for reference resolution or task execution. The experiments in this paper address natural language interaction in human-robot teams for tasks where multi-modal observations (e.g., visual, auditory, haptic) are necessary for robust execution. We present a probabilistic model, verified through physical experiments, that allows robots to efficiently acquire knowledge about latent aspects of the workspace through language and physical interaction. The model's effectiveness is demonstrated on a mobile and a stationary manipulator in real-world scenarios, following instructions under partial knowledge of object states in the environment.
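The abstract does not describe the model's internal form; the short Python sketch below is a hypothetical illustration (not the authors' method) of the general idea of fusing a noisy language cue with a noisy physical observation to update a belief over a latent, visually unobservable object attribute. All variable names and likelihood values are assumptions made for the example.

# Illustrative sketch only, not the model from the paper: a Bayes-rule update
# of the belief that an object has a latent binary attribute (e.g., "the
# bottle is full"), using a language cue and a haptic measurement in turn.

def update_belief(prior_true, likelihood_if_true, likelihood_if_false):
    """Return P(attribute = true | observation) via Bayes' rule."""
    numerator = likelihood_if_true * prior_true
    denominator = numerator + likelihood_if_false * (1.0 - prior_true)
    return numerator / denominator

belief = 0.5  # uninformed prior over the latent attribute

# Language cue: the human says the bottle is full; the speaker is treated as
# reliable but not perfect (likelihood values are assumed for illustration).
belief = update_belief(belief, likelihood_if_true=0.9, likelihood_if_false=0.2)

# Physical interaction: a lift attempt yields a heavy force/torque reading,
# which is more likely if the bottle is full (again, assumed likelihoods).
belief = update_belief(belief, likelihood_if_true=0.8, likelihood_if_false=0.3)

print(f"Posterior belief that the bottle is full: {belief:.3f}")

In this toy setting the two cues combine to a posterior of roughly 0.92, illustrating why a brief utterance or a single physical probe can be enough to disambiguate an object state that vision alone cannot resolve.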
Pages: 448-460
Page count: 13
Related Papers
50 records in total (10 listed)
  • [1] Haile Hernandez-Belmonte, Uriel; Ayala-Ramirez, Victor. Real-Time Hand Posture Recognition for Human-Robot Interaction Tasks. SENSORS, 2016, 16 (01).
  • [2] Hamandi, Mahmoud; Hatay, Emre; Fazli, Pooyan. Predicting the Target in Human-Robot Manipulation Tasks. SOCIAL ROBOTICS, ICSR 2018, 2018, 11357: 580-587.
  • [3] Kulic, D; Croft, EA. Real-time safety for human-robot interaction. ROBOTICS AND AUTONOMOUS SYSTEMS, 2006, 54 (01): 1-12.
  • [4] Kulic, D; Croft, EA. Real-time safety for human-robot interaction. 2005 12TH INTERNATIONAL CONFERENCE ON ADVANCED ROBOTICS, 2005: 719-724.
  • [5] Trifa, Vlad M.; Koene, Ansgar; Moren, Jan; Cheng, Gordon. Real-time acoustic source localization in noisy environments for human-robot multimodal interaction. 2007 RO-MAN: 16TH IEEE INTERNATIONAL SYMPOSIUM ON ROBOT AND HUMAN INTERACTIVE COMMUNICATION, VOLS 1-3, 2007: 392+.
  • [6] Burattini, Ernesto; Finzi, Alberto; Rossi, Silvia; Staffa, Mariacarla. Attentional Human-Robot Interaction in Simple Manipulation Tasks. HRI'12: PROCEEDINGS OF THE SEVENTH ANNUAL ACM/IEEE INTERNATIONAL CONFERENCE ON HUMAN-ROBOT INTERACTION, 2012: 129-130.
  • [7] Molina-Tanco, L; Bandera, JP; Marfil, R; Sandoval, F. Real-time human motion analysis for human-robot interaction. 2005 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS, VOLS 1-4, 2005: 1808-1813.
  • [8] Esteves, Claudia; Arechavaleta, Gustavo; Laumond, Jean-Paul. Motion planning for human-robot interaction in manipulation tasks. 2005 IEEE INTERNATIONAL CONFERENCE ON MECHATRONICS AND AUTOMATIONS, VOLS 1-4, CONFERENCE PROCEEDINGS, 2005: 1766-1771.
  • [9] Gast, Juergen; Bannat, Alexander; Rehrl, Tobias; Wallhoff, Frank; Rigoll, Gerhard; Wendt, Cornelia; Schmidt, Sabrina; Popp, Michael; Faerber, Berthold. Real-time Framework for Multimodal Human-Robot Interaction. HSI: 2009 2ND CONFERENCE ON HUMAN SYSTEM INTERACTIONS, 2009: 273-280.
  • [10] Petric, Tadej; Cevzar, Misel; Babic, Jan. Shared Control for Human-Robot Cooperative Manipulation Tasks. ADVANCES IN SERVICE AND INDUSTRIAL ROBOTICS, 2018, 49: 787-796.