Learning motion primitives and annotative texts from crowd-sourcing

Cited by: 0
Author
Takano W. [1]
Affiliation
[1] The University of Tokyo, Hongo, Bunkyo-ku, Tokyo
Source
ROBOMECH Journal, Vol. 2, No. 1
Funding
Japan Society for the Promotion of Science (JSPS)
Keywords
Crowd-sourcing; Motion primitives; Natural language;
DOI
10.1186/s40648-014-0022-7
Abstract
Humanoid robots are expected to be integrated into daily life, where a large variety of human actions and language expressions are observed. They need to learn the referential relations between actions and language, and to understand actions in the form of language, in order to communicate with human partners or to make inferences using language. Intensive research on imitation learning of human motions has been performed so that robots can recognize human activity and synthesize human-like motions, and this research has subsequently been extended to the integration of motion and language. This research aims at developing robots that understand human actions in the form of natural language. One difficulty comes from handling the large variety of words and sentences used in daily life, because it is too time-consuming for researchers to annotate human actions with such varied expressions. Recent developments in information and communication technology provide an efficient crowd-sourcing process in which many users are available to complete a large number of simple tasks. This paper proposes a novel concept of collecting a large training dataset of motions and their descriptive sentences, and of developing an intelligent framework that learns the relations between the motions and the sentences. This framework enables humanoid robots to understand human actions in various forms of sentences. We tested it on recognition of human daily full-body motions and demonstrated its validity. © 2015, Takano; licensee Springer.
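As a rough illustration of the idea of learning relations between motion primitives and crowd-sourced descriptive sentences, the minimal Python sketch below assumes the motions have already been segmented and labeled as primitive symbols (e.g., by clustering with stochastic motion models) and learns simple word/primitive co-occurrence statistics from annotation pairs. This is not the paper's actual model; all labels, sentences, and function names are hypothetical placeholders.

```python
import math
from collections import Counter, defaultdict

# Hypothetical crowd-sourced pairs: (motion primitive label, descriptive sentence).
training_pairs = [
    ("walk", "a person walks forward slowly"),
    ("walk", "someone is walking across the room"),
    ("wave", "a person waves their right hand"),
    ("wave", "someone waves hello"),
    ("sit",  "a person sits down on a chair"),
    ("sit",  "someone takes a seat"),
]

def tokenize(sentence):
    """Lowercase whitespace tokenization of an annotation sentence."""
    return sentence.lower().split()

# Count word occurrences per motion primitive to estimate P(word | primitive).
word_counts = defaultdict(Counter)
primitive_counts = Counter()
for primitive, sentence in training_pairs:
    primitive_counts[primitive] += 1
    word_counts[primitive].update(tokenize(sentence))

def word_likelihood(word, primitive, alpha=1.0):
    """Laplace-smoothed estimate of P(word | primitive)."""
    counts = word_counts[primitive]
    vocab = {w for c in word_counts.values() for w in c}
    return (counts[word] + alpha) / (sum(counts.values()) + alpha * len(vocab))

def recognize(sentence):
    """Return the motion primitive that best explains a query sentence
    (naive Bayes over its words, with a frequency-based prior)."""
    total = sum(primitive_counts.values())
    best, best_score = None, float("-inf")
    for primitive in primitive_counts:
        score = math.log(primitive_counts[primitive] / total)
        for word in tokenize(sentence):
            score += math.log(word_likelihood(word, primitive))
        if score > best_score:
            best, best_score = primitive, score
    return best

if __name__ == "__main__":
    print(recognize("someone is waving a hand"))      # expected: "wave"
    print(recognize("a person walks into the room"))  # expected: "walk"
```

The same co-occurrence table can be read in the other direction (most probable words given a primitive) to generate a rough description of an observed motion, which is the kind of bidirectional motion-language mapping the abstract refers to.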
Related Papers
50 records in total
  • [41] histoGraph as a Demonstrator for Domain Specific Challenges to Crowd-Sourcing
    Wieneke, Lars
    Duering, Marten
    Croce, Vincenzo
    Novak, Jasminko
    Social Informatics, 2015, 8852 : 469 - 476
  • [43] IP Geolocation with a Crowd-sourcing Broadband Performance Tool
    Lee, Yeonhee
    Park, Heasook
    Lee, Youngseok
    ACM SIGCOMM COMPUTER COMMUNICATION REVIEW, 2016, 46 (01) : 12 - 20
  • [44] Conceptual Model for Crowd-Sourcing Digital Forensic Evidence
    Baror, Stacey O.
    Venter, H. S.
    Kebande, Victor R.
    6TH INTERNATIONAL CONFERENCE ON SMART CITY APPLICATIONS, 2022, 393 : 1085 - 1099
  • [45] Program Boosting: Program Synthesis via Crowd-Sourcing
    Cochran, Robert A.
    D'Antoni, Loris
    Livshits, Benjamin
    Molnar, David
    Veanes, Margus
    ACM SIGPLAN NOTICES, 2015, 50 (01) : 677 - 688
  • [46] The GEP: Crowd-Sourcing Big Data Analysis with Undergraduates
    Elgin, Sarah C. R.
    Hauser, Charles
    Holzen, Teresa M.
    Jones, Christopher
    Kleinschmit, Adam
    Leatherman, Judith
    TRENDS IN GENETICS, 2017, 33 (02) : 81 - 85
  • [47] CROWD-SOURCING PARENTAL PREFERENCE ASSESSMENTS FOR VESICOURETERAL REFLUX
    Dionise, Zachary
    Garcia-Roig, Michael
    Kirsch, Andrew
    Routh, Jonathan
    JOURNAL OF UROLOGY, 2018, 199 (04) : E589 - E590
  • [48] Crowd-sourcing and author submission as alternatives to professional curation
    Karp, Peter D.
    DATABASE-THE JOURNAL OF BIOLOGICAL DATABASES AND CURATION, 2016
  • [49] Collaboration Trumps Homophily in Urban Mobile Crowd-sourcing
    Kandappu, Thivya
    Misra, Archan
    Tandriansyah, Randy
    CSCW'17: PROCEEDINGS OF THE 2017 ACM CONFERENCE ON COMPUTER SUPPORTED COOPERATIVE WORK AND SOCIAL COMPUTING, 2017 : 902 - 915
  • [50] Crowd-sourcing Home Energy Efficiency Measurement System
    Son, Young-Sung
    Han, Hyonyung
    Jo, Jun
    Park, Jun-Hee
    2015 INTERNATIONAL CONFERENCE ON ICT CONVERGENCE (ICTC), 2015 : 1272 - 1275