Understanding Instructions on Large Scale for Human-Robot Interaction

Cited by: 0
Authors
Xie, Jiongkun [1 ]
Chen, Xiaoping [1 ]
Affiliations
[1] Univ Sci & Technol China, Multiagents Syst Lab, Hefei 230026, Peoples R China
Keywords
Human-Robot Interaction; Instruction Understanding; Semantic Parsing; Lexicon Propagation; Graph-based Semi-supervised Learning;
DOI
10.1109/WI-IAT.2014.165
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Correctly interpreting human instructions is the first step toward human-robot interaction. Previous approaches to semantically parsing instructions relied on large numbers of annotated training examples in order to cover all the words in a domain, and annotating a sufficiently large set of instructions with semantic forms requires exhaustive engineering effort. We therefore propose propagating the semantic lexicon, so that a semantic parser can be learned from limited annotations while retaining the ability to interpret instructions on a large scale. Based on the observation that humans often use different words to refer to the same object or task, we assume that semantically close words share the same semantic form. Our approach softly maps unobserved words and phrases to the semantic forms learned from the annotated corpus through a knowledge-based lexical similarity metric. Experiments on the collected instructions showed that the semantic parser learned with lexicon propagation outperformed the baseline. Our approach thus offers robots the opportunity to understand human instructions on a large scale.
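The abstract does not spell out the propagation step, and its keywords point to a graph-based semi-supervised formulation. The sketch below is only a nearest-neighbor simplification of the lexicon-propagation idea, assuming NLTK's WordNet as the lexical knowledge base; the seed lexicon entries, semantic forms, similarity threshold, and example words are all hypothetical, not taken from the paper.

```python
# Illustrative sketch, not the paper's algorithm: map an unobserved word to the
# semantic form of its most similar seed word under a WordNet-based similarity.
# Requires: nltk.download("wordnet")
from nltk.corpus import wordnet as wn

# Hypothetical seed lexicon: annotated words mapped to invented semantic forms.
SEED_LEXICON = {
    "bring": "lambda x, y: deliver(x, y)",
    "kitchen": "room(kitchen)",
    "cup": "object(cup)",
}

def lexical_similarity(w1, w2):
    """Best Wu-Palmer similarity over all WordNet synset pairs of w1 and w2."""
    best = 0.0
    for s1 in wn.synsets(w1):
        for s2 in wn.synsets(w2):
            sim = s1.wup_similarity(s2)
            if sim is not None and sim > best:
                best = sim
    return best

def propagate(word, threshold=0.8):
    """Softly assign an unobserved word the semantic form of its closest seed
    word, returning the similarity score as a confidence weight."""
    scored = [(lexical_similarity(word, seed), seed) for seed in SEED_LEXICON]
    score, seed = max(scored)
    if score >= threshold:
        return SEED_LEXICON[seed], score
    return None, score  # too dissimilar: leave the word unmapped

if __name__ == "__main__":
    for unseen in ["mug", "fetch", "bathroom"]:
        form, score = propagate(unseen)
        print(f"{unseen!r} -> {form} (similarity {score:.2f})")
```

A full graph-based semi-supervised variant, as the keywords suggest, would instead build a similarity graph over all observed and unobserved words and propagate the semantic-form labels over its edges rather than taking a single nearest seed.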
Pages: 175 - 182
Number of pages: 8
Related Papers
50 records in total
  • [21] Advbot: Towards Understanding Human Preference in a Human-Robot Interaction Scenario
    Wong, Clarice J.
    Tay, Yong Ling
    Lew, Lincoln W. C.
    Koh, Hui Fang
    Xiong, Yijing
    Wu, Yan
    2018 15TH INTERNATIONAL CONFERENCE ON CONTROL, AUTOMATION, ROBOTICS AND VISION (ICARCV), 2018, : 1305 - 1309
  • [22] Human-robot interaction-oriented video understanding of human actions
    Wang, Bin
    Chang, Faliang
    Liu, Chunsheng
    Wang, Wenqian
    ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2024, 133
  • [23] Agreeing to Interact: Understanding Interaction as Human-Robot Goal Conflicts
    Sasabuchi, Kazuhiro
    Ikeuchi, Katsushi
    Inaba, Masayuki
    COMPANION OF THE 2018 ACM/IEEE INTERNATIONAL CONFERENCE ON HUMAN-ROBOT INTERACTION (HRI'18), 2018, : 21 - 28
  • [24] Structured learning for spoken language understanding in human-robot interaction
    Bastianelli, Emanuele
    Castellucci, Giuseppe
    Croce, Danilo
    Basili, Roberto
    Nardi, Daniele
    INTERNATIONAL JOURNAL OF ROBOTICS RESEARCH, 2017, 36 (5-7): : 660 - 683
  • [25] Understanding human-robot interaction forces: a new mechanical solution
    Pippo, Irene
    Albanese, Giulia Aurora
    Zenzeri, Jacopo
    Torazza, Diego
    Berselli, Giovanni
    INTERNATIONAL JOURNAL OF INTERACTIVE DESIGN AND MANUFACTURING - IJIDEM, 2024, 18 (07): : 4765 - 4774
  • [26] Human-robot interaction and robot control
    Sequeira, Joao
    Ribeiro, Maria Isabel
    ROBOT MOTION AND CONTROL: RECENT DEVELOPMENTS, 2006, 335 : 375 - 390
  • [27] On Interaction Quality in Human-Robot Interaction
    Bensch, Suna
    Jevtic, Aleksandar
    Hellstrom, Thomas
    ICAART: PROCEEDINGS OF THE 9TH INTERNATIONAL CONFERENCE ON AGENTS AND ARTIFICIAL INTELLIGENCE, VOL 1, 2017, : 182 - 189
  • [28] Understanding Nonverbal Communication Cues of Human Personality Traits in Human-Robot Interaction
    Zhihao Shen
    Armagan Elibol
    Nak Young Chong
    IEEE/CAA Journal of Automatica Sinica, 2020, 7 (06) : 1465 - 1477
  • [30] Human Motion Understanding for Selecting Action Timing in Collaborative Human-Robot Interaction
    Rea, Francesco
    Vignolo, Alessia
    Sciutti, Alessandra
    Noceti, Nicoletta
    FRONTIERS IN ROBOTICS AND AI, 2019, 6