Bootstrapping Humanoid Robot Skills by Extracting Semantic Representations of Human-like Activities from Virtual Reality

Cited by: 0
Authors
Ramirez-Amaro, Karinne [1 ]
Inamura, Tetsunari [2 ]
Dean-Leon, Emmanuel [1 ]
Beetz, Michael [3 ]
Cheng, Gordon [1 ]
Affiliations
[1] Tech Univ Munich, Inst Cognit Syst, Fac Elect Engn, D-80290 Munich, Germany
[2] Natl Inst Informat, Tokyo, Japan
[3] Univ Bremen, Inst Artificial Intelligence, D-28359 Bremen, Germany
Keywords: not listed
DOI: not available
Chinese Library Classification (CLC): TP24 [Robotics]
Discipline codes: 080202; 1405
Abstract
Advances in Virtual Reality have enabled well-defined and consistent virtual environments that can capture complex scenarios, such as human everyday activities. In addition, virtual simulators (such as SIGVerse) provide user-friendly interfaces between virtual robots/agents and real users, allowing better interaction. We envision that such rich scenarios can be used to train robots to learn new behaviors, especially in human everyday activities, where considerable variability is found. In this paper, we present a multi-level framework that can use different input sources, such as cameras and virtual environments, to understand and execute demonstrated activities. The framework first obtains semantic models of human activities from camera observations; these models are then tested in the SIGVerse virtual simulator, where a virtual robot demonstrates new complex activities (such as cleaning the table). The framework is integrated on a real robot, an iCub, which processes the signals from the virtual environment to understand the activities performed by the observed robot. This is realized by reusing the knowledge and experience that the robot previously acquired from observing human activities. Our results show that the framework extracts the meaning of the observed motions with 80% recognition accuracy: it obtains the object relationships in the current context via semantic representations and derives a high-level understanding of these complex activities, even when they correspond to different behaviors.
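The abstract describes inferring the meaning of observed motions from object relationships and the current context via semantic representations. The sketch below is a minimal, hypothetical illustration of that kind of rule-based semantic inference in Python; the motion labels, object-relation fields, and mapping rules are assumptions made for illustration and do not reproduce the authors' actual framework.

# Hypothetical sketch of rule-based semantic activity inference:
# map a low-level hand-motion label plus observed object relations
# to a high-level activity label. All names and rules are illustrative
# assumptions, not the paper's implementation.

from dataclasses import dataclass
from typing import Optional


@dataclass
class Observation:
    hand_motion: str                 # e.g. "move" or "not_move"
    object_in_hand: Optional[str]    # object currently grasped, if any
    object_acted_on: Optional[str]   # object the hand acts on, if any


def infer_activity(obs: Observation) -> str:
    """Infer a high-level activity from motion and object relations."""
    if obs.hand_motion == "not_move":
        return "idle" if obs.object_in_hand is None else "hold"
    if obs.object_in_hand is None:
        # Hand moves while empty: reaching toward an object or just moving.
        return "reach" if obs.object_acted_on else "move"
    # Hand moves while holding an object.
    if obs.object_acted_on == "table_surface":   # e.g. wiping with a sponge
        return "clean"
    if obs.object_acted_on is not None:
        return "put_something_somewhere"
    return "transport"


if __name__ == "__main__":
    demo = Observation(hand_motion="move",
                       object_in_hand="sponge",
                       object_acted_on="table_surface")
    print(infer_activity(demo))  # prints: clean

In this toy form, the same rules generalize across behaviors because the decision depends only on the object relations in the current context, which is the intuition behind the semantic representations described in the abstract.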
Pages: 438-443 (6 pages)