Depth-Camera-Aided Inertial Navigation Utilizing Directional Constraints

Cited: 4
Authors
Qayyum, Usman [1 ]
Kim, Jonghyuk [2 ]
Affiliations
[1] Ctr Excellence Sci & Appl Technol CESAT, Islamabad 45550, Pakistan
[2] Univ Technol Sydney, Robot Inst, Sydney, NSW 2006, Australia
Keywords
integrated inertial navigation; depth camera; directional constraints; epipolar constraints; VISION; FUSION;
DOI
10.3390/s21175913
Chinese Library Classification
O65 [Analytical Chemistry];
Subject Classification Code
070302 ; 081704 ;
Abstract
This paper presents a practical yet effective solution for integrating an RGB-D camera with an inertial sensor to handle the depth dropouts that frequently occur in outdoor environments due to the camera's short detection range and sunlight interference. When depth drops out, only partial 5-degrees-of-freedom pose information (attitude and position with an unknown scale) is available from the RGB-D sensor. To enable continuous fusion with the inertial solution, the scale-ambiguous position is cast into a directional constraint on the vehicle motion, which is, in essence, an epipolar constraint in multi-view geometry. Unlike other visual navigation approaches, this can effectively reduce drift in the inertial solution without delay, even under small-parallax motion. When a depth image is available, a window-based feature map is maintained to compute RGB-D odometry, which is then fused with the inertial outputs in an extended Kalman filter framework. Flight results from indoor and outdoor environments, as well as public datasets, demonstrate the improved navigation performance of the proposed approach.
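The directional constraint described in the abstract can be sketched in a few lines: when only the direction of travel is known (scale-ambiguous translation), the usable measurement residual is the component of the INS-predicted displacement orthogonal to the visually observed direction. This is a minimal illustrative sketch under that interpretation; the function name and the simple projection form are assumptions, not the paper's exact measurement model or EKF implementation.

```python
import numpy as np

def direction_residual(t_ins, d_vis):
    """Directional-constraint residual.

    t_ins: translation predicted by the inertial solution (3-vector).
    d_vis: scale-ambiguous translation direction from the camera (3-vector).

    Returns the component of t_ins orthogonal to d_vis, i.e.
    r = (I - d d^T) t_ins with d the unit direction. The residual is
    zero exactly when the INS motion agrees with the observed direction,
    regardless of the (unknown) scale.
    """
    d = np.asarray(d_vis, dtype=float)
    d = d / np.linalg.norm(d)
    t = np.asarray(t_ins, dtype=float)
    return t - np.dot(t, d) * d
```

In an EKF update, the corresponding measurement Jacobian with respect to the translation state would simply be the projection matrix I - d d^T, which is why the constraint can be applied immediately, even under small-parallax motion.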
Pages: 13
Related Papers
50 records in total
  • [1] Camera aided inertial navigation in poor GPS environments
    George, Michael
    Sukkarieh, Salah
    2007 IEEE AEROSPACE CONFERENCE, VOLS 1-9, 2007, : 850 - 861
  • [2] Directional Ranging for Enhanced Performance of Aided Pedestrian Inertial Navigation
    Wang, Yusheng
    Askari, Sina
    Jao, Chi-Shih
    Shkel, Andrei M.
    2019 6TH IEEE INTERNATIONAL SYMPOSIUM ON INERTIAL SENSORS & SYSTEMS (INERTIAL 2019), 2019,
  • [3] Inertial navigation aided by monocular camera observations of unknown features
    George, Michael
    Sukkarieh, Salah
    PROCEEDINGS OF THE 2007 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION, VOLS 1-10, 2007, : 3558 - +
  • [4] Camera-aided inertial navigation using epipolar points
    Zachariah, Dave
    Jansson, Magnus
    2010 IEEE-ION POSITION LOCATION AND NAVIGATION SYMPOSIUM PLANS, 2010, : 697 - 703
  • [5] Realtime Implementation of Visual-aided Inertial Navigation Using Epipolar Constraints
    Nilsson, John-Olof
    Zachariah, Dave
    Jansson, Magnus
    Handel, Peter
    2012 IEEE/ION POSITION LOCATION AND NAVIGATION SYMPOSIUM (PLANS), 2012, : 711 - 718
  • [6] Vision-Aided Inertial Navigation with Line Features and a Rolling-Shutter Camera
    Yu, Hongsheng
    Mourikis, Anastasios I.
    2015 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2015, : 892 - 899
  • [7] Tightly-coupled vision-aided inertial navigation via trifocal constraints
    Asadi, E.
    Bottasso, C. L.
    2012 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND BIOMIMETICS (ROBIO 2012), 2012,
  • [8] Geomagnetism Aided Inertial Navigation System
    Liu, Ying
    Wu, Meiping
    Hu, Xiaoping
    Xie, Hongwei
    2008 2ND INTERNATIONAL SYMPOSIUM ON SYSTEMS AND CONTROL IN AEROSPACE AND ASTRONAUTICS, VOLS 1 AND 2, 2008, : 777 - 781
  • [9] Inertial navigation aided with GPS information
    Nebot, Eduardo
    Sukkarieh, Salah
    Durrant-Whyte, Hugh
    FOURTH ANNUAL CONFERENCE ON MECHATRONICS AND MACHINE VISION IN PRACTICE, PROCEEDINGS, 1997, : 169 - 174
  • [10] Vehicle model aided inertial navigation
    Ma, X
    Sukkarieh, S
    Kim, J
    2003 IEEE INTELLIGENT TRANSPORTATION SYSTEMS PROCEEDINGS, VOLS. 1 & 2, 2003, : 1004 - 1009