FlyView: a bio-informed optical flow truth dataset for visual navigation using panoramic stereo vision

Cited: 0
Authors
Leroy, Alix [1 ]
Taylor, Graham K. [1 ]
Affiliations
[1] Univ Oxford, Dept Biol, Oxford Flight Grp, Oxford OX1 3SZ, England
Funding
European Research Council;
Keywords
CALIBRATION;
DOI
Not available
CLC Classification
TP18 [Artificial Intelligence Theory];
Subject Classification
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Flying at speed through complex environments is a difficult task that has been performed successfully by insects since the Carboniferous [1], but remains a challenge for robotic and autonomous systems. Insects navigate the world using optical flow sensed by their compound eyes, which they process using a deep neural network implemented on hardware weighing just a few milligrams. Deploying an insect-inspired network architecture in computer vision could therefore enable more efficient and effective ways of estimating structure and self-motion using optical flow. Training a bio-informed deep network to implement these tasks requires biologically relevant training, test, and validation data. To this end, we introduce FlyView, a novel bio-informed truth dataset for visual navigation. This simulated dataset is rendered using open-source 3D scenes in which the agent's position is known at every frame, and is accompanied by truth data on depth, self-motion, and motion flow. This dataset, comprising 42,475 frames, has several key features that are missing from existing optical flow datasets, including: (i) panoramic camera images, with a monocular and binocular field of view matched to that of a fly's compound eyes; (ii) dynamically meaningful self-motion, modelled on motion primitives or the 3D trajectories of drones and flies; and (iii) complex natural and indoor environments, including reflective surfaces, fog, and clouds.
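For a panoramic (spherical) camera, ground-truth motion flow of the kind the abstract describes follows from depth and self-motion via the standard spherical motion-field equation: for a unit viewing direction d, range R, camera translation velocity t, and angular velocity ω, the flow is ḋ = -(t - (t·d)d)/R - ω × d. The sketch below illustrates this relationship in NumPy; the function name and array layout are illustrative assumptions, not FlyView's actual tooling or API.

```python
import numpy as np

def spherical_motion_field(dirs, depth, t, omega):
    """Ground-truth angular flow on a spherical (panoramic) image.

    dirs:  (N, 3) unit viewing directions
    depth: (N,) range along each direction
    t:     (3,) camera translation velocity (camera frame)
    omega: (3,) camera angular velocity (camera frame)
    Returns (N, 3) flow vectors tangent to the view sphere.
    """
    dirs = np.asarray(dirs, dtype=float)
    depth = np.asarray(depth, dtype=float)
    # Translational component: -(t - (t.d)d) / R, the tangential part of -t/R.
    t_perp = (t - (dirs @ t)[:, None] * dirs) / depth[:, None]
    # Rotational component: -omega x d (depth-independent).
    rot = -np.cross(omega, dirs)
    return -t_perp + rot
```

The rotational term is independent of depth, which is why pure rotations carry no structure information; only the translational term, scaled by 1/R, encodes scene geometry.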
Pages: 15