Monocular Visual Position and Attitude Estimation Method of a Drogue Based on Coaxial Constraints

Cited by: 8
Authors
Zhao, Kedong [1]
Sun, Yongrong [1]
Zhang, Yi [1]
Li, Hua [1]
Affiliations
[1] Nanjing Univ Aeronaut & Astronaut, Coll Automat Engn, Nav Res Ctr, Nanjing 211106, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
monocular vision; aerial drogue; position and attitude estimation; circular feature; duality; VISION; CIRCLES;
DOI
10.3390/s21165673
Chinese Library Classification (CLC)
O65 [Analytical Chemistry];
Discipline Code
070302; 081704;
Abstract
In aerial refueling, the circular feature on the drogue's stabilizing umbrella deforms to some extent, which leads to a duality problem when estimating position from a single circular feature. In this paper, a monocular visual position and attitude estimation method for a drogue is proposed based on coaxial constraints. Firstly, a procedure for scene recovery from a single circle is introduced. The coaxial constraints of the drogue are then proposed and, by analyzing the matrix of the spatial structure, shown to be effective for eliminating the duality. The proposed method consists of fitting the parameters of the spatial circles by restoring the 3D points on them, eliminating the duality with the two-level coaxial constraints, and optimizing the normal vector of the plane in which the inner circle lies. Finally, the effectiveness and robustness of the proposed method are verified, and the influence of the coaxial circles' spatial structure on the method is explored through simulations of and experiments on a drogue model. Even under heavy noise interference, the duality-elimination success rate of the method remains more than 10% higher than that of competing methods. In addition, the accuracy of the normal vector obtained by the fusion algorithm is improved, with the mean angle error reduced by more than 26.7%.
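The duality-elimination idea summarized in the abstract can be illustrated with a small geometric sketch. The Python snippet below is a minimal illustration, not the authors' implementation: it assumes a hypothetical single-circle pose solver (not shown) has already produced the two ambiguous (center, normal) hypotheses for the drogue's inner and outer circles, and it keeps the candidate pair that best satisfies the coaxial constraints, namely parallel plane normals and circle centers lying on a common axis. The function and variable names are illustrative only.

import numpy as np

def coaxiality_error(c_in, n_in, c_out, n_out):
    """Score how well two single-circle pose hypotheses satisfy the coaxial
    constraints: (1) the two plane normals are parallel, and (2) the line
    joining the two circle centers is parallel to those normals."""
    n_in = n_in / np.linalg.norm(n_in)
    n_out = n_out / np.linalg.norm(n_out)
    # Angle between the two normals, insensitive to sign flips.
    normal_err = np.arccos(np.clip(abs(n_in @ n_out), 0.0, 1.0))
    # Angle between the center-to-center axis and the inner-circle normal.
    axis = c_out - c_in
    axis = axis / np.linalg.norm(axis)
    axis_err = np.arccos(np.clip(abs(axis @ n_in), 0.0, 1.0))
    return normal_err + axis_err

def select_coaxial_pair(inner_candidates, outer_candidates):
    """Pick the (inner, outer) candidate pair with the smallest coaxiality
    error. Each candidate is a (center, normal) pair; a single-circle pose
    solver typically returns two such ambiguous solutions per circle."""
    best_pair, best_err = None, np.inf
    for c_in, n_in in inner_candidates:
        for c_out, n_out in outer_candidates:
            err = coaxiality_error(c_in, n_in, c_out, n_out)
            if err < best_err:
                best_pair, best_err = ((c_in, n_in), (c_out, n_out)), err
    return best_pair, best_err

# Toy usage: the true configuration is coaxial along the z-axis; each circle
# also carries a spurious candidate with a tilted normal (the duality).
z = np.array([0.0, 0.0, 1.0])
tilted = np.array([0.3, 0.0, 0.954])
inner = [(np.array([0.0, 0.0, 5.0]), z), (np.array([0.1, 0.0, 5.0]), tilted)]
outer = [(np.array([0.0, 0.0, 5.4]), z), (np.array([0.0, 0.1, 5.4]), tilted)]
(best_inner, best_outer), err = select_coaxial_pair(inner, outer)
print(best_inner, best_outer, err)

In this toy case the pair with parallel normals and collinear centers is selected, mirroring how the coaxial constraints discriminate the true circle pose from its dual.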
Pages: 19