Learning motion primitives and annotative texts from crowd-sourcing

Cited by: 0
Authors
Takano W. [1 ]
Affiliation
[1] The University of Tokyo, Hongo, Bunkyo-ku, Tokyo
Source
ROBOMECH Journal, Volume 2, Issue 1
Funding
Japan Society for the Promotion of Science
Keywords
Crowd-sourcing; Motion primitives; Natural language
DOI
10.1186/s40648-014-0022-7
Abstract
Humanoid robots are expected to be integrated into daily life, where a large variety of human actions and language expressions are observed. They need to learn the referential relations between actions and language, and to understand actions in the form of language, in order to communicate with human partners or to make inferences using language. Intensive research on imitation learning of human motions has produced robots that can recognize human activity and synthesize human-like motions, and this research has subsequently been extended to the integration of motion and language. The present research aims at developing robots that understand human actions in the form of natural language. One difficulty lies in handling the large variety of words and sentences used in daily life, because it is too time-consuming for researchers to annotate human actions with such varied expressions. Recent developments in information and communication technology have made crowd-sourcing an efficient process in which many users complete large numbers of simple tasks. This paper proposes a novel approach to collecting a large training dataset of motions and their descriptive sentences, and to developing an intelligent framework that learns the relations between the motions and the sentences. This framework enables humanoid robots to understand human actions in various forms of sentences. We tested the framework on recognition of human daily full-body motions and demonstrated its validity. © 2015, Takano; licensee Springer.
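As a rough illustration of the kind of motion-to-language relation the abstract describes, the sketch below pairs motion-primitive labels with crowd-sourced descriptive sentences and estimates P(word | primitive) from co-occurrence counts. The primitive labels, sentences, and the simple bag-of-words model are illustrative assumptions for this sketch only; the abstract does not specify the paper's actual learning algorithm.

# Minimal sketch (assumptions noted above): learn word associations for
# motion primitives from crowd-sourced (primitive, sentence) pairs.
from collections import Counter, defaultdict

# Hypothetical crowd-sourced training data: each motion primitive label
# is paired with a free-form sentence written by an annotator.
training_pairs = [
    ("walk", "a person walks forward slowly"),
    ("walk", "someone is walking across the room"),
    ("wave", "a person waves the right hand"),
    ("wave", "someone waves hello"),
    ("sit",  "a person sits down on a chair"),
]

# Count how often each word co-occurs with each motion primitive.
word_counts = defaultdict(Counter)
primitive_counts = Counter()
for primitive, sentence in training_pairs:
    primitive_counts[primitive] += 1
    for word in sentence.lower().split():
        word_counts[primitive][word] += 1

def describe(primitive, top_n=3):
    """Return the words most strongly associated with a motion primitive,
    ranked by the empirical probability P(word | primitive)."""
    counts = word_counts[primitive]
    total = sum(counts.values())
    probs = {w: c / total for w, c in counts.items()}
    return sorted(probs, key=probs.get, reverse=True)[:top_n]

if __name__ == "__main__":
    for p in primitive_counts:
        print(p, "->", describe(p))

A co-occurrence table like this is only a stand-in for the learned relation; richer models (for example, probabilistic models linking motion segments to full sentences) would be needed to generate grammatical descriptions rather than ranked word lists.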