Visual odometry of Rhinecanthus aculeatus depends on the visual density of the environment

Cited: 5
Authors
Karlsson, Cecilia [1]
Willis, Jay [1]
Patel, Matishalin [1]
de Perera, Theresa Burt [1]
Affiliation
[1] Dept Zool, Res & Adm Bldg, 11a Mansfield Rd, Oxford OX1 3SZ, England
Funding
Biotechnology and Biological Sciences Research Council (BBSRC), UK;
Keywords
USE OPTIC FLOW; OPTOMOTOR RESPONSE; ANT ODOMETER; DISTANCE; HONEYBEES; NAVIGATION; FLIGHT;
DOI
10.1038/s42003-022-03925-5
Chinese Library Classification
Q [Biological Sciences];
Discipline Classification Codes
07; 0710; 09;
Abstract
Distance travelled is a crucial metric that underpins an animal's ability to navigate over short ranges. While there is extensive research on how terrestrial animals measure travel distance, it is unknown how animals navigating in aquatic environments estimate this metric. A common method used by land animals is to measure optic flow, where the speed of self-induced visual motion is integrated over the course of a journey. Whether freely swimming aquatic animals also measure distance relative to a visual frame of reference is unclear. Using the marine fish Rhinecanthus aculeatus, we show that teleost fish can use visual motion information to estimate distance travelled. However, the underlying mechanism differs fundamentally from that of previously studied terrestrial animals. Humans and terrestrial invertebrates measure the total angular motion of visual features for odometry, a mechanism that does not vary with visual density. In contrast, the visual odometer used by Rhinecanthus aculeatus is strongly dependent on the visual density of the environment. Odometry in fish may therefore be mediated by a movement-detection mechanism akin to the system underlying the optomotor response, a separate motion-detection mechanism used by both vertebrates and invertebrates for course and gaze stabilisation. A freely swimming teleost fish thus reproduces a previously learned distance using visual motion information, but its distance estimate depends on the visual density of the environment.
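To make the abstract's contrast concrete, the following minimal Python sketch (illustrative only, not the paper's analysis code; the function names, parameter values, and the fixed-wall-distance idealisation are all assumptions) compares a density-independent odometer that integrates total angular image motion with a density-dependent one that counts feature transits.

# Illustrative sketch, not the paper's code. Contrasts two idealised visual
# odometer models for an animal travelling alongside a striped wall.
#
# Model 1 (terrestrial-style, per the abstract): integrate total angular
# image motion. For features on a wall at a fixed lateral distance d, the
# abeam optic-flow speed is v/d, so integrating over a journey of length s
# gives roughly s/d, independent of how densely the wall is textured.
#
# Model 2 (qualitatively like the density-dependent fish result): count
# feature (stripe) transits, which scales with stripes per metre.

def integrated_angular_motion(path_length_m: float, wall_distance_m: float) -> float:
    """Total angular image motion (radians): path length / lateral distance."""
    return path_length_m / wall_distance_m

def feature_transit_count(path_length_m: float, stripes_per_m: float) -> float:
    """Number of stripe transits: path length times visual density."""
    return path_length_m * stripes_per_m

if __name__ == "__main__":
    path = 2.0   # metres travelled (hypothetical)
    wall = 0.3   # lateral distance to the striped wall (hypothetical)
    for density in (5, 10, 20):  # stripes per metre (hypothetical densities)
        ang = integrated_angular_motion(path, wall)
        hits = feature_transit_count(path, density)
        print(f"density={density:2d}/m  angular motion={ang:.2f} rad  transits={hits:.0f}")

Running the sketch prints the same angular-motion total (about 6.67 rad) at every density, while the transit count doubles whenever the stripe density doubles, mirroring the density-independent versus density-dependent distinction drawn in the abstract.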
Pages: 7