Data fusion for visual tracking dedicated to human-robot interaction

Cited by: 0
Authors
Brèthes, L [1 ]
Lerasle, F [1 ]
Danès, P [1 ]
Affiliation
[1] CNRS, LAAS, F-31077 Toulouse, France
Keywords
DOI
Not available
CLC number
TP24 [Robotics];
Subject classification codes
080202; 1405;
Abstract
The interaction between humans and machines has become an important topic for the robotics community, as it can broaden the use of robots. In this context, advanced robots must be able to interpret human motion and gestures in order to perform tasks for humans or in synergy with them. The purpose of this paper is to present a real-time system for face/hand tracking and hand gesture recognition within the particle filtering framework. We introduce mechanisms for visual data fusion within particle filtering to develop trackers that combine, in a novel way, color and shape cues, skin-blob detection, and frontal face detection. For face tracking, fusing color- and shape-based modalities avoids noticeable drift and, in the worst case, even subsequent loss of the target. For gesture interpretation, an extension is proposed to recognize, within the tracking loop, the current hand posture and its motion in the video stream. In both tracking scenarios, the combination or fusion of cues proves more robust in cluttered environments than any single cue. The overall performance of the proposed trackers and future work are also discussed.
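The abstract describes particle-filter trackers whose measurement step fuses several visual cues. As a rough illustration only, the Python sketch below shows one generic SIR (sample-importance-resample) iteration with a fused color/shape likelihood; fused_likelihood, color_model and shape_model are hypothetical placeholders and do not reproduce the importance or measurement functions actually proposed in the paper.

```python
import numpy as np

def fused_likelihood(particle, frame, color_model, shape_model):
    """Hypothetical fused measurement model: product of independent cue likelihoods.

    color_model and shape_model are assumed callables returning p(z_cue | x) for a
    candidate state; the paper's actual cue models (color histograms, shape/contour
    matching, skin-blob or frontal-face detection) are not reproduced here.
    """
    return color_model(frame, particle) * shape_model(frame, particle)


def particle_filter_step(particles, weights, frame, color_model, shape_model,
                         motion_noise=5.0, rng=None):
    """One SIR iteration with multi-cue fusion.

    particles: (N, 2) array of candidate (x, y) target positions in the image.
    weights:   (N,) normalized importance weights from the previous frame.
    """
    rng = np.random.default_rng() if rng is None else rng
    n = len(particles)
    # 1. Resample proportionally to the previous weights.
    particles = particles[rng.choice(n, size=n, p=weights)]
    # 2. Predict with a simple random-walk motion model.
    particles = particles + rng.normal(0.0, motion_noise, particles.shape)
    # 3. Update: weight each particle by the fused color x shape likelihood.
    weights = np.array([fused_likelihood(p, frame, color_model, shape_model)
                        for p in particles])
    weights = weights / (weights.sum() + 1e-12)   # guard against all-zero weights
    # 4. Estimate the target as the weighted mean of the particle cloud.
    estimate = (weights[:, None] * particles).sum(axis=0)
    return particles, weights, estimate
```

In practice the cue models would be evaluated on the current video frame (e.g. a skin-color histogram score and a contour-matching score), and the product form assumes the cues are conditionally independent given the state, which is a common simplification in multi-cue particle filtering.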
Pages: 2075-2080
Number of pages: 6