Real-Time Human-Robot Communication for Manipulation Tasks in Partially Observed Environments

Cited by: 2
Authors
Arkin, Jacob [1 ]
Paul, Rohan [2 ]
Park, Daehyung [2 ]
Roy, Subhro [2 ]
Roy, Nicholas [2 ]
Howard, Thomas M. [1 ]
Affiliations
[1] Univ Rochester, Rochester, NY 14627 USA
[2] MIT, Cambridge, MA 02139 USA
Keywords
LANGUAGE
DOI
10.1007/978-3-030-33950-0_39
CLC Classification
TP24 [Robotics]
Subject Classification Codes
080202; 1405
Abstract
In human teams, visual and auditory cues are often used to communicate information about the task and/or environment that may not otherwise be directly observable. Analogously, robots that primarily rely on visual sensors cannot directly observe some attributes of objects that may be necessary for reference resolution or task execution. The experiments in this paper address natural language interaction in human-robot teams for tasks where multi-modal (e.g., visual, auditory, haptic) observations are necessary for robust execution. We present a probabilistic model, verified through physical experiments, that allows robots to efficiently acquire knowledge about the latent aspects of the workspace through language and physical interaction. The model's effectiveness is demonstrated on a mobile and a stationary manipulator in real-world scenarios by following instructions under partial knowledge of object states in the environment.
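The abstract describes fusing language and physical-interaction observations to estimate latent object attributes that vision alone cannot reveal. A minimal sketch of this idea (not the authors' actual model; all likelihood values are illustrative assumptions) is a recursive Bayesian update of a binary belief, where a spoken cue and a haptic measurement each refine the robot's estimate:

```python
# Minimal sketch, assuming a single binary latent attribute (e.g., "the cup
# is full") that cannot be seen but can be inferred from a language cue and
# a haptic measurement. The likelihood values below are invented for
# illustration, not taken from the paper.

def bayes_update(prior, likelihood_true, likelihood_false):
    """Posterior P(attr = True | obs) given a prior and observation likelihoods."""
    numerator = likelihood_true * prior
    denominator = numerator + likelihood_false * (1.0 - prior)
    return numerator / denominator

belief = 0.5  # uninformative prior over the latent attribute

# Language observation: the human says "the cup is full".
# Assume speech is informative but not perfectly reliable.
belief = bayes_update(belief, likelihood_true=0.9, likelihood_false=0.1)

# Physical interaction: the robot lifts the cup and measures a large wrench,
# an observation more likely when the cup is full.
belief = bayes_update(belief, likelihood_true=0.8, likelihood_false=0.3)

print(round(belief, 3))  # belief sharpens as modalities are fused
```

Each observation multiplies the prior by its likelihood and renormalizes, so independent modalities compound: here the posterior rises from 0.5 to 0.9 after the utterance and to 0.96 after the haptic measurement.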
Pages: 448-460 (13 pages)