Optic flow aided navigation and 3D scene reconstruction

Cited by: 0
Author
Rollason, Malcolm [1 ]
Affiliation
[1] QinetiQ Ltd, Farnborough GU14 0LX, Hants, England
Keywords
Aided navigation; optic flow; structure from motion; Parrot; quadrotor;
DOI
10.1117/12.2027498
CLC Classification Number
TM [Electrotechnics]; TN [Electronic Technology, Communication Technology];
Subject Classification Codes
0808 ; 0809 ;
Abstract
An important enabler for low cost airborne systems is the ability to exploit low cost inertial instruments. An Inertial Navigation System (INS) can provide a navigation solution, when GPS is denied, by integrating measurements from inertial sensors. However, the gyroscope and accelerometer biases of low cost inertial sensors cause compound errors in the integrated navigation solution. This paper describes experiments to establish whether (and to what extent) the navigation solution can be aided by fusing measurements from an on-board video camera with measurements from the inertial sensors. The primary aim of the work was to establish whether optic flow aided navigation is beneficial even when the 3D structure within the observed scene is unknown. A further aim was to investigate whether an INS can help to infer 3D scene content from video. Experiments with both real and synthetic data have been conducted. Real data was collected using an AR Parrot quadrotor. Empirical results illustrate that optic flow provides a useful aid to navigation even when the 3D structure of the observed scene is not known. With optic flow aiding of the INS, the computed trajectory is consistent with the true camera motion, whereas the unaided INS yields a rapidly increasing position error (the data represents ~40 seconds, after which the unaided INS is ~50 metres in error and has passed through the ground). The results of the Monte Carlo simulation concur with the empirical result. Position errors, which grow as a quadratic function of time when unaided, are substantially checked by the availability of optic flow measurements.
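The abstract's claim that unaided position errors grow quadratically with time follows from double integration: a constant accelerometer bias b integrates once into a linearly growing velocity error and a second time into a position error of roughly 0.5·b·t². The minimal sketch below (not from the paper; the function name and the 0.05 m/s² bias value are illustrative assumptions for a low-cost MEMS sensor) dead-reckons that error numerically:

```python
def ins_position_error(bias, t, dt):
    """Position error from double-integrating a constant accelerometer bias.

    bias : accelerometer bias in m/s^2 (assumed constant)
    t    : total integration time in seconds
    dt   : integration time step in seconds
    """
    n = int(t / dt)
    v = 0.0  # accumulated velocity error (m/s)
    p = 0.0  # accumulated position error (m)
    for _ in range(n):
        v += bias * dt  # first integration: bias -> velocity error
        p += v * dt     # second integration: velocity -> position error
    return p

# A hypothetical 0.05 m/s^2 bias over the ~40 s flight described in the
# abstract yields an error of roughly 0.5 * 0.05 * 40^2 = 40 m, the same
# order of magnitude as the ~50 m divergence reported for the unaided INS.
err = ins_position_error(bias=0.05, t=40.0, dt=0.01)
print(f"position error after 40 s: {err:.1f} m")  # ~40 m
```

Aiding the INS with optic flow bounds this growth because the flow measurements observe velocity, breaking one of the two integration stages that compound the bias.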
Pages: 15