Sound and visual tracking for humanoid robot

Cited by: 14
Authors
Okuno, HG
Nakadai, K
Lourens, T
Kitano, H
Affiliations
[1] Kyoto Univ, Grad Sch Informat, Dept Intelligence Sci & Technol, Sakyo Ku, Kyoto 6068501, Japan
[2] Honda Res Inst Japan Co Ltd, Wako, Saitama 3510014, Japan
[3] Sony Comp Sci Labs, Shinagawa Ku, Tokyo, Japan
[4] Japan Sci & Technol Corp, Shibuya Ku, Tokyo 1500001, Japan
Keywords
robot audition; audio-visual integration; audio-visual tracking; computational auditory scene analysis;
DOI
10.1023/B:APIN.0000021417.62541.e0
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Mobile robots capable of auditory perception usually adopt the "stop-perceive-act" principle, remaining stationary while listening in order to avoid the motor noise generated by their own movement. Although this principle reduces the complexity of auditory processing for mobile robots, it restricts their auditory capabilities. In this paper, sound and visual tracking are investigated as complementary cues that compensate for each other's drawbacks and attain robust object tracking: visual tracking may fail under occlusion, while sound tracking may yield ambiguous localization due to the nature of auditory processing. For this purpose, we present an active audition system for a humanoid robot. The audition system of a highly intelligent humanoid requires localization of sound sources and identification of the meanings of sounds in the auditory scene. The active audition reported in this paper focuses on improved sound source tracking by integrating audition, vision, and motor control. Given multiple sound sources in the auditory scene, the humanoid SIG actively moves its head to improve localization, aligning its microphones orthogonal to the sound source and capturing possible sound sources by vision. The system adaptively cancels motor noise using motor control signals. The experimental results demonstrate the effectiveness of combined sound and visual tracking.
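The abstract's core idea, audition and vision compensating for each other's failure modes, can be illustrated with a minimal sketch. This is an illustrative reconstruction, not the paper's actual algorithm: the function name, angle values, and standard deviations below are hypothetical, and the fusion rule shown is generic inverse-variance weighting.

```python
def fuse_azimuth(audio_deg, audio_std, visual_deg=None, visual_std=2.0):
    """Fuse auditory and visual azimuth estimates by inverse-variance weighting.

    Audition gives a coarse but always-available direction estimate;
    vision is precise but drops out under occlusion (visual_deg is None).
    Weighting each cue by 1/variance lets the more reliable one dominate.
    """
    if visual_deg is None:
        # Occlusion: fall back on audition alone.
        return audio_deg
    w_audio = 1.0 / audio_std ** 2
    w_visual = 1.0 / visual_std ** 2
    return (w_audio * audio_deg + w_visual * visual_deg) / (w_audio + w_visual)
```

With these assumed uncertainties (audio_std = 10 degrees, visual_std = 2 degrees), a visual fix at 28 degrees pulls a 30-degree auditory estimate to roughly 28.1 degrees; when vision is occluded, the auditory estimate is used unchanged.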
Pages: 253-266
Page count: 14
Related Papers
50 records in total
  • [1] Sound and Visual Tracking for Humanoid Robot
    Hiroshi G. Okuno
    Kazuhiro Nakadai
    Tino Lourens
    Hiroaki Kitano
    Applied Intelligence, 2004, 20 : 253 - 266
  • [2] Visual Tracking Control of Humanoid Robot
    Juang, Li-Hong
    Zhang, Jian-Sen
    IEEE ACCESS, 2019, 7 : 29213 - 29222
  • [3] Proprioceptive Visual Tracking of a Humanoid Robot Head Motion
    Peixoto, Joao
    Santos, Vitor
    Silva, Filipe
    IMAGE ANALYSIS AND RECOGNITION (ICIAR 2016), 2016, 9730 : 503 - 511
  • [4] Visual servoing based positioning and object tracking on humanoid robot
    Bombile, Michael
    Lecture Notes in Electrical Engineering, 2015, 312 : 19 - 27
  • [5] Identical visual cues based tracking system of humanoid robot
    Rajpar, Altaf Hussain
    Keerio, Mohammad Usman
    Chandio, Abdul Fattah
    Kawaja, Attaullah
    SECOND INTERNATIONAL CONFERENCE ON EMERGING TECHNOLOGIES 2006, PROCEEDINGS, 2006, : 643 - 649
  • [6] Social interaction of humanoid robot based on audio-visual tracking
    Okuno, HG
    Nakadai, K
    Kitano, H
    DEVELOPMENTS IN APPLIED ARTIFICIAL INTELLIGENCE, PROCEEDINGS, 2002, 2358 : 725 - 735
  • [7] Motion Control Strategy for Redundant Visual Tracking Mechanism of a Humanoid Robot
    Wan, Minhong
    Zhang, Huiwen
    Fu, Mingliang
    Zhou, Weijia
    2016 8TH INTERNATIONAL CONFERENCE ON INTELLIGENT HUMAN-MACHINE SYSTEMS AND CYBERNETICS (IHMSC), VOL. 2, 2016, : 156 - 159
  • [8] Eye Gaze Tracking for a Humanoid Robot
    Palinko, Oskar
    Rea, Francesco
    Sandini, Giulio
    Sciutti, Alessandra
    2015 IEEE-RAS 15TH INTERNATIONAL CONFERENCE ON HUMANOID ROBOTS (HUMANOIDS), 2015, : 318 - 324
  • [9] Hand tracking with a humanoid robot head
    Ambrosch, Roland
    Malisa, Viktorio
    ANNALS OF DAAAM FOR 2007 & PROCEEDINGS OF THE 18TH INTERNATIONAL DAAAM SYMPOSIUM: INTELLIGENT MANUFACTURING & AUTOMATION: FOCUS ON CREATIVITY, RESPONSIBILITY, AND ETHICS OF ENGINEERS, 2007, : 11 - 12
  • [10] Tracking for humanoid robot based on Kinect
    Zhong, Qiubo
    Zhao, Jie
    Tong, Chunya
    2014 INTERNATIONAL CONFERENCE ON MECHATRONICS AND CONTROL (ICMC), 2014, : 1191 - 1194