Self-Localization of an Omnidirectional Mobile Robot Based on an Optical Flow Sensor

Cited by: 1
Authors
Atsushi Sanada
Kazuo Ishii
Tetsuya Yagi
Affiliations
[1] Kyushu Institute of Technology
[2] Osaka University
Keywords
omnidirectional mobile robot; silicon retina camera; optical flow; FPGA
DOI
Not available
Abstract
An omnidirectional mobile robot has the advantage that its three degrees of freedom of motion in a 2D plane can be set independently, so it can move in arbitrary directions while maintaining the same heading. As in car-like robots, dead reckoning with onboard sensors is often used for self-localization in omnidirectional robots, by measuring wheel velocities from motor encoder data. However, omnidirectional mobile robots slip easily because of the nature of omni-wheels with multiple free rollers, and dead reckoning fails if even one wheel loses contact with the ground. An odometry method whose data are not affected by wheel slip must therefore be introduced to acquire high-quality self-location data for omnidirectional mobile robots. We describe a method to obtain robot ego-motion from camera images and optical flow calculation, i.e., the camera is used as a velocity sensor. In this paper, a silicon retina vision camera, which has a wide dynamic range under various lighting conditions, is introduced as a mobile robot sensor. A Field-Programmable Gate Array (FPGA) optical flow circuit for the silicon retina is also developed to measure the ego-motion of the mobile robot. The developed optical flow calculation system is installed on a small omnidirectional mobile robot, and evaluation experiments on the robot's ego-motion are carried out. In the experiments, the accuracy of self-localization by the dead-reckoning and optical flow methods is evaluated by comparison against motion capture. The results show that the optical flow sensor yields more accurate position estimates than dead reckoning.
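The encoder-based dead reckoning that the abstract contrasts against can be illustrated with a minimal sketch. This is not the paper's implementation: the wheel mount angles `ALPHA` and body radius `R` below are assumed values for a generic three-wheel omni-drive, chosen only to make the kinematics concrete.

```python
# Hedged sketch: dead-reckoning pose integration for a three-wheel
# omnidirectional robot. Geometry (wheel angles, body radius) is assumed,
# not taken from the paper's robot.
import numpy as np

ALPHA = np.radians([0.0, 120.0, 240.0])  # assumed wheel mount angles
R = 0.1  # assumed distance from robot center to each wheel [m]

# Kinematic matrix: wheel speed v_i = -sin(a_i)*vx + cos(a_i)*vy + R*omega
J = np.column_stack([-np.sin(ALPHA), np.cos(ALPHA), np.full(3, R)])

def body_velocity(wheel_speeds):
    """Recover (vx, vy, omega) in the body frame from measured wheel speeds."""
    return np.linalg.solve(J, np.asarray(wheel_speeds, dtype=float))

def integrate_pose(pose, wheel_speeds, dt):
    """One Euler step of dead reckoning: body velocity -> world-frame pose."""
    x, y, th = pose
    vx, vy, w = body_velocity(wheel_speeds)
    # Rotate the body-frame velocity into the world frame before integrating.
    x += (vx * np.cos(th) - vy * np.sin(th)) * dt
    y += (vx * np.sin(th) + vy * np.cos(th)) * dt
    th += w * dt
    return (x, y, th)
```

Any wheel slip corrupts `wheel_speeds` directly, which is exactly why the paper replaces this velocity source with optical flow measured against the ground: the flow-derived (vx, vy, omega) can be integrated by the same pose-update step without depending on wheel-ground contact.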
Pages: S172 - S176
Related papers (50 total)
  • [31] Global ultrasonic system for self-localization of mobile robot
    Yi, SY
    IEICE TRANSACTIONS ON COMMUNICATIONS, 2003, E86B (07) : 2171 - 2177
  • [32] Self-Localization for a Mobile Robot in harsh Outdoor Environments
    Kopfstedt, Thomas
    Beutel, Andreas
    Steurer, Bjoern
    SYMPOSIUM GYRO TECHNOLOGY 2011: INERTIAL SENSORS AND SYSTEMS 2011, 2011
  • [33] A Self-Localization Method for Stereo Camera on the Mobile Robot
    Lian Xiaofeng
    Zhang Haiqiang
    Liu Zaiwen
    Wang Xiaoyi
    2008 CHINESE CONTROL AND DECISION CONFERENCE, VOLS 1-11, 2008, : 898 - +
  • [34] Semi-autonomous learning of an RFID sensor model for mobile robot self-localization
    Department of Computer Science, University of Tübingen, Tübingen, Germany
    Springer Tracts in Advanced Robotics, 2008: 273 - 282
  • [35] Semi-autonomous learning of an RFID sensor model for mobile robot self-localization
    Vorst, Philipp
    Zell, Andreas
    EUROPEAN ROBOTICS SYMPOSIUM 2008, 2008, 44 : 273 - 282
  • [36] Mobile robot self-localization without explicit landmarks
    Brown, RG
    Donald, BR
    ALGORITHMICA, 2000, 26 (3-4) : 515 - 559
  • [37] Mobile Robot Self-Localization without Explicit Landmarks
    R. G. Brown
    B. R. Donald
    Algorithmica, 2000, 26 : 515 - 559
  • [38] A framework for robust and incremental self-localization of a mobile robot
    Jogan, M
    Artac, M
    Skocaj, D
    Leonardis, A
    COMPUTER VISION SYSTEMS, PROCEEDINGS, 2003, 2626 : 460 - 469
  • [39] Fast landmark tracking and localization algorithm for the mobile robot self-localization
    Yoon, KJ
    Jang, GJ
    Kim, SH
    Kweon, IS
    MOBILE ROBOT TECHNOLOGY, PROCEEDINGS, 2001, : 185 - 190
  • [40] Speed control of a sonar-based mobile robot with considering the self-localization
    Emaru, Takanori
    Tanaka, Kazuo
    Tsuchiya, Takeshi
    2005 IEEE INTERNATIONAL CONFERENCE ON MECHATRONICS AND AUTOMATIONS, VOLS 1-4, CONFERENCE PROCEEDINGS, 2005, : 125 - 130