Fully neuromorphic vision and control for autonomous drone flight

Cited by: 10
Authors
Paredes-Valles, F. [1 ]
Hagenaars, J. J. [1 ]
Dupeyroux, J. [1 ]
Stroobants, S. [1 ]
Xu, Y. [1 ]
de Croon, G. C. H. E. [1 ]
Affiliations
[1] Delft Univ Technol, Fac Aerosp Engn, Micro Air Vehicle Lab, Delft, Netherlands
Keywords
NEURAL-NETWORKS; LOIHI; FLOW;
DOI
10.1126/scirobotics.adi0591
CLC classification: TP24 [Robotics]
Discipline codes: 080202; 1405
Abstract
Biological sensing and processing is asynchronous and sparse, leading to low-latency and energy-efficient perception and action. In robotics, neuromorphic hardware for event-based vision and spiking neural networks promises to exhibit similar characteristics. However, robotic implementations have been limited to basic tasks with low-dimensional sensory inputs and motor actions because of the restricted network size in current embedded neuromorphic processors and the difficulties of training spiking neural networks. Here, we present a fully neuromorphic vision-to-control pipeline for controlling a flying drone. Specifically, we trained a spiking neural network that accepts raw event-based camera data and outputs low-level control actions for performing autonomous vision-based flight. The vision part of the network, consisting of five layers and 28,800 neurons, maps incoming raw events to ego-motion estimates and was trained with self-supervised learning on real event data. The control part consists of a single decoding layer and was learned with an evolutionary algorithm in a drone simulator. Robotic experiments show a successful sim-to-real transfer of the fully learned neuromorphic pipeline. The drone could accurately control its ego-motion, allowing for hovering, landing, and maneuvering sideways, even while yawing at the same time. The neuromorphic pipeline runs on board on Intel's Loihi neuromorphic processor with an execution frequency of 200 hertz, consuming 0.94 watt of idle power and a mere additional 7 to 12 milliwatts when running the network. These results illustrate the potential of neuromorphic sensing and processing for enabling insect-sized intelligent robots.
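The abstract describes a stack of spiking layers that maps sparse event input to ego-motion estimates, followed by a single linear decoding layer that produces low-level control commands. The following is a minimal, hypothetical sketch of that pipeline shape using leaky integrate-and-fire (LIF) neurons; all layer sizes, weights, and constants are illustrative assumptions, not the paper's architecture or trained parameters.

```python
import numpy as np

class LIFLayer:
    """Hypothetical leaky integrate-and-fire layer: integrates weighted
    input spikes into membrane potentials, fires when a threshold is
    crossed, and resets fired neurons."""

    def __init__(self, n_in, n_out, leak=0.9, threshold=1.0, seed=0):
        rng = np.random.default_rng(seed)
        self.w = rng.normal(0.0, 0.5, (n_in, n_out))  # synaptic weights
        self.leak, self.threshold = leak, threshold
        self.v = np.zeros(n_out)  # membrane potentials

    def step(self, spikes_in):
        # Leaky integration of incoming spikes.
        self.v = self.leak * self.v + spikes_in @ self.w
        spikes_out = (self.v >= self.threshold).astype(float)
        self.v[spikes_out == 1.0] = 0.0  # reset neurons that fired
        return spikes_out

# Five spiking layers (vision) followed by one linear decoder (control).
# Sizes are placeholders; the paper's vision network has 28,800 neurons.
layers = [LIFLayer(256, 128, seed=1), LIFLayer(128, 64, seed=2),
          LIFLayer(64, 64, seed=3), LIFLayer(64, 32, seed=4),
          LIFLayer(32, 16, seed=5)]
decoder = np.random.default_rng(6).normal(0.0, 0.1, (16, 4))  # 4 commands

rng = np.random.default_rng(7)
for t in range(10):  # simulate 10 timesteps of sparse event input
    x = (rng.random(256) < 0.05).astype(float)  # ~5% of pixels emit events
    for layer in layers:
        x = layer.step(x)
    u = x @ decoder  # continuous low-level control outputs
```

Note that only the final decoding layer produces continuous values; every intermediate signal is a binary spike vector, which is what makes such a pipeline amenable to asynchronous, low-power neuromorphic hardware like Loihi.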
Pages: 18
Related papers
(50 records total)
  • [41] Using Competition to Control Congestion in Autonomous Drone Systems
    Manrique, Pedro D.
    Johnson, D. Dylan
    Johnson, Neil F.
    ELECTRONICS, 2017, 6 (02)
  • [42] Fully Actuated Autonomous Flight of Thruster-Tilting Multirotor
    Lee, Seung Jae
    Lee, Dongjae
    Kim, Junha
    Kim, Dabin
    Jang, Inkyu
    Kim, H. Jin
    IEEE-ASME TRANSACTIONS ON MECHATRONICS, 2021, 26 (02) : 765 - 776
  • [43] CNS Flight Stack for Reproducible, Customizable, and Fully Autonomous Applications
    Scheiber, Martin
    Fornasier, Alessandro
    Jung, Roland
    Bohm, Christoph
    Dhakate, Rohit
    Stewart, Christian
    Steinbrener, Jan
    Weiss, Stephan
    Brommer, Christian
    IEEE ROBOTICS AND AUTOMATION LETTERS, 2022, 7 (04) : 11283 - 11290
  • [44] Autonomous Flight Control for an RC Helicopter
    Camilleri, Michael
    Scerri, Kenneth
    Zammit, Saviour
    2012 16TH IEEE MEDITERRANEAN ELECTROTECHNICAL CONFERENCE (MELECON), 2012, : 391 - 394
  • [45] Learning Agile, Vision-Based Drone Flight: From Simulation to Reality
    Scaramuzza, Davide
    Kaufmann, Elia
    ROBOTICS RESEARCH, ISRR 2022, 2023, 27 : 11 - 18
  • [46] Development of a Control and Vision Interface for an AR.Drone
    Cheema, Prasad
    Luo, Simon
    Gibbens, Peter
    2016 8TH INTERNATIONAL CONFERENCE ON COMPUTER AND AUTOMATION ENGINEERING (ICCAE 2016), 2016, 56
  • [47] Blockchain-Enabled Federated Learning with Neuromorphic Edge Devices for Drone Identification and Flight Mode Detection
    Henderson, Alex
    Yakopcic, Chris
    Colter, Jamison
    Harbour, Steven
    Taha, Tarek
    2023 IEEE/AIAA 42ND DIGITAL AVIONICS SYSTEMS CONFERENCE, DASC, 2023,
  • [48] Drone Vision
    Greene, Daniel
    SURVEILLANCE & SOCIETY, 2015, 13 (02) : 233 - 249
  • [49] Learning Deep Sensorimotor Policies for Vision-based Autonomous Drone Racing
    Fu, Jiawei
    Song, Yunlong
    Wu, Yan
    Yu, Fisher
    Scaramuzza, Davide
    2023 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2023, : 5243 - 5250
  • [50] Testing a Vision-Based Autonomous Drone Navigation Model in a Forest Environment
    Lee, Alvin
    Yong, Suet-Peng
    Pedrycz, Witold
    Watada, Junzo
    ALGORITHMS, 2024, 17 (04)