Observation Timing Planning Method for Sequential Images of Autonomous Navigation Based on Observability

Cited by: 0
Authors
Li J. [1 ]
Wang D. [1 ]
Dong T. [1 ]
Li M. [2 ]
Xu C. [2 ]
Fu F. [3 ]
Affiliations
[1] Beijing Institute of Spacecraft System Engineering, Beijing
[2] Beijing Institute of Control Engineering, Beijing
[3] School of Aeronautics and Astronautics, Sun Yat-sen University, Shenzhen
Source
Yuhang Xuebao/Journal of Astronautics | 2023, Vol. 44, No. 3
Keywords
Autonomous navigation; Observability; Planetary landing; Sequential image
DOI
10.3873/j.issn.1000-1328.2023.03.010
Abstract
To reduce the computational burden of image processing during planetary landing autonomous navigation, the observability of sequential-image-based autonomous navigation is analyzed and an observation timing planning method is proposed. The observability analysis yields the minimum number of observations required to make the state observable in unknown environments, which serves as the boundary condition for switching landmarks. On this basis, the optimal observation time interval is obtained by optimizing a constructed depth estimation error model, so that the observation timing can be planned adaptively and the number of image processing operations can be reduced. The simulation results verify the correctness of the observability analysis and the effectiveness of the proposed observation timing planning method: compared with observing landmarks at every sampling moment, it reduces the number of landmark observations by 45.9% without significantly affecting the navigation accuracy. These results show that the proposed method can effectively reduce the computational burden of the onboard image processing algorithm and greatly improve the sequential-image-based autonomous navigation capability of a planetary lander in unknown environments. © 2023 China Spaceflight Society. All rights reserved.
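As a rough illustration of the adaptive timing idea described in the abstract, the sketch below balances a stand-in triangulation error term (which shrinks as the baseline between two images grows) against a stand-in inertial drift term (which grows while no landmark is observed). The error model, all constants, and the names `optimal_interval` and `N_MIN_OBS` are illustrative assumptions, not the paper's actual depth estimation error model or observability result.

```python
import math

# Assumed minimum number of observations per landmark needed for the state
# to be observable (the paper derives this from its observability analysis;
# the value here is a placeholder).
N_MIN_OBS = 3

def optimal_interval(speed, depth, pixel_sigma=1.0, focal=1000.0,
                     drift_rate=0.05, dt_min=0.1, dt_max=5.0):
    """Return an observation interval dt (s) minimizing a stand-in
    error model J(dt) = A/dt + B*dt, where:

    A/dt : triangulation-style depth error, shrinking as the baseline
           speed*dt between two images grows;
    B*dt : inertial drift accumulated while no image is processed.
    """
    # Triangulation coefficient: worse for distant terrain (depth**2),
    # better for a faster-moving lander (longer baseline per second).
    A = pixel_sigma * depth ** 2 / (focal * max(speed, 1e-6))
    B = drift_rate
    # d/d(dt) [A/dt + B*dt] = 0  =>  dt* = sqrt(A/B), an interior minimum.
    dt_star = math.sqrt(A / B)
    # Clamp the result to the allowed planning window.
    return min(max(dt_star, dt_min), dt_max)

# Example: replan the next observation time as the lander descends.
t_next = 0.0 + optimal_interval(speed=40.0, depth=2000.0)
```

Because the interval is recomputed from the current speed and terrain depth, the observation timing adapts over the descent; a full implementation would also enforce `N_MIN_OBS` observations of each newly acquired landmark before the switching boundary condition allows the previous landmark to be dropped.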
Pages: 411-421
Page count: 10