Intelligent Vision-based Autonomous Ship Landing of VTOL UAVs

Cited by: 5
Authors
Lee, Bochan [1 ]
Saj, Vishnu [2 ]
Kalathil, Dileep [2 ]
Benedict, Moble [2 ]
Affiliations
[1] Republic of Korea Navy, Gyeryong-si, Chungcheongnam-do, South Korea
[2] Texas A&M University, College Station, TX 77843, USA
Funding
U.S. National Science Foundation
Keywords
QUADROTOR;
DOI
10.4050/JAHS.68.022010
Chinese Library Classification (CLC)
V [Aviation, Aerospace]
Subject classification codes
08; 0825
Abstract
The paper discusses an intelligent vision-based control solution for autonomous tracking and landing of vertical take-off and landing (VTOL) capable unmanned aerial vehicles (UAVs) on ships without utilizing GPS signals. The central idea is to automate the Navy helicopter ship landing procedure, in which the pilot uses the ship as the visual reference for long-range tracking but relies on a standardized visual cue installed on most Navy ships, called the "horizon bar," for the final approach and landing phases. This idea is implemented using a uniquely designed nonlinear controller integrated with machine vision. The vision system uses machine-learning-based object detection for long-range ship tracking and classical computer vision to estimate the aircraft's relative position and orientation from the horizon bar during the final approach and landing phases. The nonlinear controller operates on the information estimated by the vision system and has demonstrated robust tracking performance even in the presence of uncertainties. The developed autonomous ship landing system was implemented on a quadrotor UAV equipped with an onboard camera, and approach and landing were successfully demonstrated on a moving deck that imitates realistic ship deck motions. Extensive simulations and flight tests were conducted to demonstrate vertical landing safety, tracking capability, and landing accuracy.
Pages: 14
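
As a rough illustration of the two-phase vision pipeline summarized in the abstract, the sketch below pairs a placeholder learned ship detector (for long-range tracking) with a classical computer-vision routine that segments a bright, elongated horizon-bar-like marker and reports its image-plane angle and centroid. This is not the authors' implementation: the detector stub, the HSV thresholds, the aspect-ratio test, and the proportional gain are all illustrative assumptions.

    # Minimal sketch, not the paper's implementation: a two-phase vision
    # pipeline in the spirit of the abstract -- a learned detector for
    # long-range ship tracking, then classical computer vision on the ship's
    # "horizon bar" during final approach. Thresholds and gains are guesses.

    import cv2
    import numpy as np


    def detect_ship_bbox(frame):
        """Placeholder for a machine-learning object detector (e.g., a
        YOLO-class network). Should return (x, y, w, h) of the ship in
        pixels, or None if no ship is found."""
        raise NotImplementedError("plug a trained detector in here")


    def horizon_bar_measurement(frame, hsv_lo=(0, 0, 200), hsv_hi=(180, 40, 255)):
        """Estimate the horizon bar's image-plane angle and centroid.

        Assumes the bar appears as a bright, elongated blob after HSV
        thresholding (an illustrative assumption, not the paper's method).
        Returns (angle_deg, cx, cy) or None if no bar-like contour is found.
        """
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, np.array(hsv_lo), np.array(hsv_hi))
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        if not contours:
            return None
        blob = max(contours, key=cv2.contourArea)        # largest bright region
        (cx, cy), (w, h), angle = cv2.minAreaRect(blob)  # oriented bounding box
        if max(w, h) < 4.0 * min(w, h):                  # reject non-elongated blobs
            return None
        if w < h:                                        # align angle with long axis
            angle += 90.0
        return angle, cx, cy


    def lateral_command(cx, frame_width, gain=0.002):
        """Toy proportional correction that centers the bar in the image;
        the paper's nonlinear controller is far more involved than this."""
        return gain * (frame_width / 2.0 - cx)

A downstream controller would hand over from the detector-driven long-range tracking phase to the horizon-bar-driven measurements once the bar estimate becomes reliable, mirroring the transition between long-range tracking and final approach described in the abstract.
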
Related papers
50 records in total
  • [31] Autonomous Vision-based Target Detection and Safe Landing for UAV
    Mohammed Rabah
    Ali Rohan
    Muhammad Talha
    Kang-Hyun Nam
    Sung Ho Kim
    International Journal of Control, Automation and Systems, 2018, 16 : 3013 - 3025
  • [32] Vision-based Autonomous Landing for Rotorcraft Unmanned Aerial Vehicle
    Bu, Chaovan
    Ai, Yunfeng
    Du, Huajun
    2016 IEEE INTERNATIONAL CONFERENCE ON VEHICULAR ELECTRONICS AND SAFETY (ICVES), 2016, : 77 - 82
  • [33] Vision-based autonomous landing of a quadrotor using a gimbaled camera
    Jiang, Tao
    Lin, Defu
    Song, Tao
    PROCEEDINGS OF THE INSTITUTION OF MECHANICAL ENGINEERS PART G-JOURNAL OF AEROSPACE ENGINEERING, 2019, 233 (14) : 5093 - 5106
  • [34] Overview of Landmarks for Autonomous, Vision-Based Landing of Unmanned Helicopters
    Chen, Yong
    Liu, Heng-Li
    IEEE AEROSPACE AND ELECTRONIC SYSTEMS MAGAZINE, 2016, 31 (05) : 14 - 27
  • [35] Vision-Based Autonomous Landing on Unprepared Field With Rugged Surface
    Liu, Zhifa
    Wang, Chunyuan
    Chen, Kejing
    Meng, Wei
    IEEE SENSORS JOURNAL, 2022, 22 (18) : 17914 - 17923
  • [36] Research on the application of vision-based autonomous navigation to the landing of the UAV
    Liu, XH
    Cao, YF
    FIFTH INTERNATIONAL SYMPOSIUM ON INSTRUMENTATION AND CONTROL TECHNOLOGY, 2003, 5253 : 385 - 388
  • [37] Autonomous Vision-based Target Detection and Safe Landing for UAV
    Rabah, Mohammed
    Rohan, Ali
    Talha, Muhammad
    Nam, Kang-Hyun
    Kim, Sung Ho
    INTERNATIONAL JOURNAL OF CONTROL AUTOMATION AND SYSTEMS, 2018, 16 (06) : 3013 - 3025
  • [38] An Approach Toward Visual Autonomous Ship Board Landing of a VTOL UAV
    Sanchez-Lopez, Jose Luis
    Pestana, Jesus
    Saripalli, Srikanth
    Campoy, Pascual
    JOURNAL OF INTELLIGENT & ROBOTIC SYSTEMS, 2014, 74 (1-2) : 113 - 127
  • [39] An Approach Toward Visual Autonomous Ship Board Landing of a VTOL UAV
    Jose Luis Sanchez-Lopez
    Jesus Pestana
    Srikanth Saripalli
    Pascual Campoy
    Journal of Intelligent & Robotic Systems, 2014, 74 : 113 - 127
  • [40] Vision-based control for helicopter ship landing with handling qualities constraints
    Quang Huy Truong
    Rakotomamonjy, Thomas
    Taghizad, Armin
    Biannic, Jean-Marc
    IFAC PAPERSONLINE, 2016, 49 (17): : 118 - 123