Fully neuromorphic vision and control for autonomous drone flight

Cited by: 10
Authors:
Paredes-Valles, F. [1 ]
Hagenaars, J. J. [1 ]
Dupeyroux, J. [1 ]
Stroobants, S. [1 ]
Xu, Y. [1 ]
de Croon, G. C. H. E. [1 ]
Affiliation:
[1] Delft Univ Technol, Fac Aerosp Engn, Micro Air Vehicle Lab, Delft, Netherlands
Keywords:
Neural networks; Loihi; Flow
DOI:
10.1126/scirobotics.adi0591
Chinese Library Classification: TP24 [Robotics]
Subject Classification Codes: 080202; 1405
Abstract:
Biological sensing and processing are asynchronous and sparse, leading to low-latency and energy-efficient perception and action. In robotics, neuromorphic hardware for event-based vision and spiking neural networks promises to exhibit similar characteristics. However, robotic implementations have been limited to basic tasks with low-dimensional sensory inputs and motor actions because of the restricted network size in current embedded neuromorphic processors and the difficulties of training spiking neural networks. Here, we present a fully neuromorphic vision-to-control pipeline for controlling a flying drone. Specifically, we trained a spiking neural network that accepts raw event-based camera data and outputs low-level control actions for performing autonomous vision-based flight. The vision part of the network, consisting of five layers and 28,800 neurons, maps incoming raw events to ego-motion estimates and was trained with self-supervised learning on real event data. The control part consists of a single decoding layer and was learned with an evolutionary algorithm in a drone simulator. Robotic experiments show a successful sim-to-real transfer of the fully learned neuromorphic pipeline. The drone could accurately control its ego-motion, allowing for hovering, landing, and maneuvering sideways, even while yawing at the same time. The neuromorphic pipeline runs on board, on Intel's Loihi neuromorphic processor, with an execution frequency of 200 hertz, consuming 0.94 watt of idle power and a mere additional 7 to 12 milliwatts when running the network. These results illustrate the potential of neuromorphic sensing and processing for enabling insect-sized intelligent robots.
Pages: 18
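
As a rough illustration of the pipeline the abstract describes, the following minimal Python/NumPy sketch wires binary event frames through leaky integrate-and-fire (LIF) spiking layers and a single non-spiking linear decoding layer to motor commands. All layer sizes, time constants, seeds, and weights here are hypothetical stand-ins; the actual network (five layers, 28,800 neurons, self-supervised vision training plus an evolved decoder, running at 200 hertz on Loihi) is specified only in the paper.

import numpy as np

class LIFLayer:
    """Leaky integrate-and-fire layer; random weights stand in for the
    trained weights, which are not reproduced here."""
    def __init__(self, n_in, n_out, tau=0.9, threshold=1.0, seed=0):
        rng = np.random.default_rng(seed)
        self.w = rng.normal(0.0, 1.0 / np.sqrt(n_in), (n_out, n_in))
        self.tau = tau             # membrane leak factor per step
        self.threshold = threshold
        self.v = np.zeros(n_out)   # membrane potentials

    def step(self, spikes_in):
        self.v = self.tau * self.v + self.w @ spikes_in  # leak + integrate
        spikes_out = (self.v >= self.threshold).astype(float)
        self.v[spikes_out > 0.0] = 0.0                   # reset fired neurons
        return spikes_out

# Vision part (hypothetical sizes): event input -> sparse ego-motion code.
vision = [LIFLayer(32 * 32, 256, seed=0), LIFLayer(256, 64, seed=1)]
# Control part: one non-spiking linear decoding layer -> 4 motor commands.
decode_w = np.random.default_rng(2).normal(0.0, 0.1, (4, 64))

def control_step(event_frame):
    """One control tick (200 Hz in the paper): binary event frame in,
    low-level motor command vector out."""
    s = event_frame.ravel().astype(float)
    for layer in vision:
        s = layer.step(s)
    return decode_w @ s

# Usage: feed a few random sparse event frames through the loop.
rng = np.random.default_rng(3)
for _ in range(3):
    frame = (rng.random((32, 32)) < 0.02).astype(float)
    print(control_step(frame))

The split into stateful spiking layers followed by a plain linear readout mirrors the abstract's division between the self-supervised vision network and the single evolved decoding layer.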