Integrated vision/inertial navigation method of UAVs in indoor environment

Cited by: 0
Authors
Wang T. [1 ]
Cai Z. [1 ,2 ]
Wang Y. [1 ,2 ]
Affiliations
[1] School of Automation Science and Electrical Engineering, Beijing University of Aeronautics and Astronautics, Beijing
[2] Aircraft Control Integration National Defense Key Laboratory, Beijing University of Aeronautics and Astronautics, Beijing
Keywords
Multi-sensor fusion; Optical flow; ORB features; UAV; Vision navigation;
DOI
10.13700/j.bh.1001-5965.2016.0965
CLC number
Subject classification code
Abstract
A new integrated navigation method based on an inertial sensor, optical flow and visual odometry is proposed for indoor self-navigation of UAVs in GPS-denied environments. An ORB-feature-based optical flow method is also proposed for estimating the real-time three-axis velocity of the UAV. The algorithm improves on the traditional pyramid Lucas-Kanade method by computing sparse optical flow at feature points, and the tracking of feature points is made more accurate by applying forward-backward tracking and random sample consensus (RANSAC) strategies. For position estimation, an integrated vision/inertial visual odometry method is adopted, which combines artificial landmarks, visual optical flow information and inertial navigation data. Finally, the velocity and position estimates of the proposed method are validated in actual flight tests, by comparison with the velocity measurements of PX4Flow and Guidance modules and with the position data of a motion capture system. © 2018, Editorial Board of JBUAA. All rights reserved.
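The forward-backward tracking and RANSAC outlier-rejection steps mentioned in the abstract can be sketched as below. This is a minimal illustrative sketch in numpy, not the authors' implementation: it assumes point tracks have already been produced by a tracker (the paper uses ORB features with pyramid Lucas-Kanade, not shown here) and uses a simplified pure-translation motion model for the RANSAC hypothesis; the function names `fb_filter` and `ransac_translation` are hypothetical.

```python
import numpy as np

def fb_filter(pts0, pts_bwd, thresh=1.0):
    """Forward-backward check: keep tracks whose backward-tracked
    position returns close to the original point location."""
    err = np.linalg.norm(pts_bwd - pts0, axis=1)
    return err < thresh

def ransac_translation(p0, p1, iters=200, tol=2.0, seed=0):
    """Estimate the dominant 2-D translation between matched point sets
    with RANSAC (1-point hypothesis), rejecting outlier tracks."""
    rng = np.random.default_rng(seed)
    best = np.zeros(len(p0), dtype=bool)
    for _ in range(iters):
        i = rng.integers(len(p0))
        t = p1[i] - p0[i]                        # hypothesis from one track
        inliers = np.linalg.norm(p0 + t - p1, axis=1) < tol
        if inliers.sum() > best.sum():
            best = inliers
    t = (p1[best] - p0[best]).mean(axis=0)       # refine on the consensus set
    return t, best

# Synthetic example: 20 feature tracks under a true translation,
# one gross tracking outlier and one forward-backward failure.
pts0 = np.array([[x, y] for x in range(5) for y in range(4)], dtype=float)
t_true = np.array([3.0, -2.0])
pts_fwd = pts0 + t_true
pts_fwd[0] = [100.0, 100.0]     # outlier track (wrong forward match)
pts_bwd = pts0.copy()
pts_bwd[1] += 5.0               # backward tracking drifted away

mask = fb_filter(pts0, pts_bwd)                  # drops track 1
t_est, inliers = ransac_translation(pts0[mask], pts_fwd[mask])
```

In a real pipeline the same two filters would be applied to the Lucas-Kanade track endpoints each frame, and the surviving inlier flow vectors, scaled by height and camera intrinsics, would feed the velocity estimate.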
Pages: 176-186
Page count: 10
References
18 entries in total
  • [1] Shen S.J., Autonomous navigation in complex indoor and outdoor environments with micro aerial vehicles, (2014)
  • [2] Wu Q., Cai Z.H., Wang Y.X., Optical flow and landmark fusion method for UAV indoor navigation, Control Theory & Applications, 32, 11, pp. 1511-1517, (2015)
  • [3] Li P., Lambert A., A monocular odometer for a quadrotor using a homography model and inertial cues, IEEE Conference on Robotics and Biomimetics, pp. 570-575, (2015)
  • [4] Ye C.C., Research on localization and object tracking for the IARC mission7, (2016)
  • [5] Mur-Artal R., Montiel J.M.M., Tardos J.D., ORB-SLAM: A versatile and accurate monocular SLAM system, IEEE Transactions on Robotics, 31, 5, pp. 1147-1163, (2015)
  • [6] Leutenegger S., Furgale P., Rabaud V., Et al., Keyframe-based visual-inertial SLAM using nonlinear optimization, Robotics: Science and Systems, pp. 789-795, (2013)
  • [7] Chao H., Gu Y., Gross J., Et al., A comparative study of optical flow and traditional sensors in UAV navigation, American Control Conference (ACC), pp. 3858-3863, (2013)
  • [8] Mammarella M., Campa G., Fravolini M.L., Et al., Comparing optical flow algorithms using 6-dof motion of real-world rigid objects, IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews, 42, 6, pp. 1752-1762, (2012)
  • [9] PHANTOM 4 user's manual V1.2
  • [10] Hover Camera 2016