POSE ESTIMATION FOR NON-COOPERATIVE SPACECRAFT RENDEZVOUS USING NEURAL NETWORKS

Cited by: 0
Authors
Sharma, Sumant [1 ]
D'Amico, Simone [1 ]
Affiliations
[1] Stanford Univ, Dept Aeronaut & Astronaut, 496 Lomita Mall, Stanford, CA 94305 USA
Keywords
DOI
Not available
CLC classification number
V [Aeronautics, Astronautics]
Discipline classification code
08; 0825
Abstract
This work introduces the Spacecraft Pose Network (SPN) for on-board estimation of the pose, i.e., the relative position and attitude, of a known non-cooperative spacecraft using monocular vision. In contrast to other state-of-the-art pose estimation approaches for spaceborne applications, the SPN method does not require the formulation of hand-engineered features and only requires a single grayscale image to determine the pose of the spacecraft relative to the camera. The SPN method uses a Convolutional Neural Network (CNN) with three branches to solve the problems of relative attitude and relative position estimation. The first branch of the CNN bootstraps a state-of-the-art object detection algorithm to detect a 2D bounding box around the target spacecraft in the input image. The region inside the 2D bounding box is then used by the other two branches of the CNN to determine the relative attitude by initially classifying the input region into discrete coarse attitude labels before regressing to a finer estimate. The SPN method then uses a novel Gauss-Newton algorithm to estimate the relative position by using the constraints imposed by the detected 2D bounding box and the estimated relative attitude. The secondary contribution of this work is the generation of the Spacecraft PosE Estimation Dataset (SPEED), which is used to train and evaluate the performance of the SPN method. SPEED consists of synthetic as well as actual camera images of a mock-up of the Tango spacecraft from the PRISMA mission. The synthetic images are created by fusing OpenGL-based renderings of the spacecraft's 3D model with actual images of the Earth captured by the Himawari-8 meteorological satellite. The actual camera images are created using a 7 degrees-of-freedom robotic arm, which positions and orients a vision-based sensor with respect to a full-scale mock-up of the Tango spacecraft. Custom illumination devices simulate the Earth albedo and Sun light with high fidelity to emulate the illumination conditions present in space. The SPN method, trained only on synthetic images, produces degree-level relative attitude error and cm-level relative position errors when evaluated on the actual camera images not used during training.
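The abstract describes a CNN with three branches: one that detects a 2D bounding box around the target, one that classifies the cropped region into coarse attitude labels, and one that regresses a finer attitude estimate. The following is a minimal sketch, not the authors' implementation, of such a three-branch architecture in PyTorch; the backbone layers, the number of coarse attitude bins, and the unit-quaternion output are illustrative assumptions.

# Minimal sketch (not the SPN code) of a three-branch pose network in PyTorch.
# Assumptions: a small shared convolutional backbone, a bounding-box head,
# a coarse-attitude classification head over hypothetical discrete bins,
# and a fine-attitude head producing a unit quaternion.
import torch
import torch.nn as nn

class ThreeBranchPoseNet(nn.Module):
    def __init__(self, num_attitude_bins: int = 512):
        super().__init__()
        # Shared feature extractor over a single grayscale image (1 channel).
        self.backbone = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 128, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # Branch 1: 2D bounding box around the target (x_min, y_min, x_max, y_max).
        self.bbox_head = nn.Linear(128, 4)
        # Branch 2: coarse attitude classification over discrete attitude labels.
        self.coarse_attitude_head = nn.Linear(128, num_attitude_bins)
        # Branch 3: fine attitude regression, expressed here as a unit quaternion.
        self.fine_attitude_head = nn.Linear(128, 4)

    def forward(self, image: torch.Tensor):
        features = self.backbone(image)
        bbox = self.bbox_head(features)
        coarse_logits = self.coarse_attitude_head(features)
        quaternion = self.fine_attitude_head(features)
        quaternion = quaternion / quaternion.norm(dim=-1, keepdim=True)
        return bbox, coarse_logits, quaternion

# Usage on a single 1 x 224 x 224 grayscale image.
if __name__ == "__main__":
    net = ThreeBranchPoseNet()
    bbox, coarse_logits, quaternion = net(torch.rand(1, 1, 224, 224))
    print(bbox.shape, coarse_logits.shape, quaternion.shape)

In the paper, the detected bounding box and the estimated attitude are then fed to a Gauss-Newton solver for relative position; that step is not part of this sketch.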
Pages: 3527-3546
Number of pages: 20
Related papers
50 items in total
  • [41] Using consecutive point clouds for pose and motion estimation of tumbling non-cooperative target
    Li, Yipeng
    Wang, Yunpeng
    Xie, Yongchun
    [J]. ADVANCES IN SPACE RESEARCH, 2019, 63 (05) : 1576 - 1587
  • [42] Non-linear Dynamics Method to Angles-Only Navigation for Non-cooperative Rendezvous of Spacecraft
    Du, Ronghua
    Liao, Wenhe
    Zhang, Xiang
    [J]. Transactions of Nanjing University of Aeronautics and Astronautics, 2022, 39 (04) : 400 - 414
  • [43] Evaluation of Head Pose Estimation Methods for a Non-cooperative Biometric System
    Wlodarczyk, Michal
    Kacperski, Damian
    Krotewicz, Pawel
    Grabowski, Kamil
    [J]. PROCEEDINGS OF THE 23RD INTERNATIONAL CONFERENCE ON MIXED DESIGN OF INTEGRATED CIRCUITS AND SYSTEMS (MIXDES 2016), 2016, : 394 - 398
  • [44] Non-cooperative pose estimation for cubesat based on point set registration
    Qiao, Liyan
    [J]. Science Press (37)
  • [45] Multi-Feature Fusion Based Relative Pose Adaptive Estimation for On-Orbit Servicing of Non-Cooperative Spacecraft
    Wu, Yunhua
    Yang, Nan
    Chen, Zhiming
    Hua, Bing
    [J]. Journal of Harbin Institute of Technology (New Series), 2019, 26 (06) : 19 - 30
  • [46] Pose Determination of Non-cooperative Spacecraft based on multi-feature information fusion
    Liu, Hong
    Wang, Zhichao
    Wang, Bin
    Li, Zhiqi
    [J]. 2013 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND BIOMIMETICS (ROBIO), 2013, : 1538 - 1543
  • [47] Adaptive sliding mode disturbance observer-based control for rendezvous with non-cooperative spacecraft
    Zhu, Xiaoyu
    Chen, Junli
    Zhu, Zheng H.
    [J]. ACTA ASTRONAUTICA, 2021, 183 : 59 - 74
  • [48] Efficient pose and motion estimation of non-cooperative target based on LiDAR
    Li, Peng
    Wang, Mao
    Fu, Jinyu
    Zhang, Bing
    [J]. APPLIED OPTICS, 2022, 61 (27) : 7820 - 7829
  • [49] Pose Estimation Method for Non-Cooperative Target Based on Deep Learning
    Deng, Liwei
    Suo, Hongfei
    Jia, Youquan
    Huang, Cheng
    [J]. AEROSPACE, 2022, 9 (12)
  • [50] Hybrid Filter Design for Relative Motion Estimation of Non-cooperative Spacecraft
    Lu, Shan
    Zhang, Shiyuan
    [J]. Yuhang Xuebao/Journal of Astronautics, 2023, 44 (05) : 764 - 773