Differential geometric approach for visual navigation in indoor scenes

Cited by: 0
Authors
Fuentes, L. [1 ]
Gonzalo-Tasis, M. [1 ]
Bermudez, G. [1 ]
Finat, J. [1 ]
Affiliations
[1] Univ Valladolid, MOBIVA Grp, Valladolid, Spain
Keywords
visual navigation; motion analysis; matching, correspondence and flow; Kalman filtering
DOI: not available
Chinese Library Classification: TP18 [Artificial Intelligence Theory]
Discipline codes: 081104; 0812; 0835; 1405
Abstract
Visual perception of the environment provides a detailed scene representation that helps improve motion planning and obstacle-avoidance navigation for wheelchairs in non-structured indoor scenes. In this work we develop a mobile representation of the scene based on perspective maps for automatic navigation in the absence of prior information about the scene. Images are captured with a passive, low-cost video camera. The main feature for visual navigation in this work is a map of quadrilaterals with apparent motion. From this mobile map, perspective maps are updated following a hierarchical grouping of quadrilateral maps given by pencils of perspective lines through vanishing points. Egomotion is interpreted in terms of maps of mobile quadrilaterals. The main contributions of this paper are the introduction of Lie expansion/contraction operators for quadrilaterals/cuboids and the adaptation of Kalman filtering to moving quadrilaterals to estimate and predict the egomotion of a mobile platform. Our approach is modular and flexible enough to adapt to indoor and outdoor scenes, provided at least four homologous cuboids are present in the scene between each pair of sampled views of a video sequence.
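The Kalman-filtering step described above can be illustrated with a minimal sketch. This is not the authors' implementation: it tracks a single quadrilateral corner coordinate across video frames with a constant-velocity model (state = position and velocity, measurement = noisy corner position), and all noise parameters (`q`, `r`) and the sample measurements are assumptions chosen for illustration.

```python
# Minimal sketch (illustrative only, not the paper's implementation):
# a per-axis constant-velocity Kalman filter for one quadrilateral
# corner coordinate observed in successive video frames.

class Kalman1D:
    def __init__(self, pos, dt=1.0, q=1e-3, r=0.25):
        self.x = [pos, 0.0]                 # state: [position, velocity]
        self.P = [[1.0, 0.0], [0.0, 1.0]]   # state covariance
        self.dt, self.q, self.r = dt, q, r  # assumed noise parameters

    def predict(self):
        dt = self.dt
        # State prediction x' = F x, with F = [[1, dt], [0, 1]].
        self.x = [self.x[0] + dt * self.x[1], self.x[1]]
        p00, p01 = self.P[0]
        p10, p11 = self.P[1]
        # Covariance prediction P' = F P F^T + Q
        # (process noise q injected on the velocity term).
        self.P = [
            [p00 + dt * (p01 + p10) + dt * dt * p11, p01 + dt * p11],
            [p10 + dt * p11, p11 + self.q],
        ]
        return self.x[0]

    def update(self, z):
        # Measurement model H = [1, 0]: only the position is observed.
        s = self.P[0][0] + self.r       # innovation covariance S = H P H^T + R
        k0 = self.P[0][0] / s           # Kalman gain K = P H^T / S
        k1 = self.P[1][0] / s
        y = z - self.x[0]               # innovation (residual)
        self.x = [self.x[0] + k0 * y, self.x[1] + k1 * y]
        p00, p01 = self.P[0]
        p10, p11 = self.P[1]
        # Covariance update P = (I - K H) P.
        self.P = [
            [(1 - k0) * p00, (1 - k0) * p01],
            [p10 - k1 * p00, p11 - k1 * p01],
        ]

# Track the x-coordinate of one corner drifting right at roughly 2 px/frame
# (measurements are hypothetical noisy observations).
kf = Kalman1D(pos=100.0)
for z in [102.1, 103.9, 106.2, 107.8, 110.1]:
    kf.predict()
    kf.update(z)
```

In the paper's setting one such filter would run per corner coordinate of each tracked quadrilateral, and the predicted corner positions would feed the egomotion estimate; extending the state to full 8-dimensional quadrilateral vectors is a straightforward matrix generalization of the same predict/update cycle.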
Pages: 468 / +
Number of pages: 2