Point-line feature fusion based field real-time RGB-D SLAM

Cited by: 9
Authors
Li, Qingyu [1]
Wang, Xin [2]
Wu, Tian [1]
Yang, Huijun [1,3,4]
Affiliations
[1] Northwest A&F Univ, Coll Informat Engn, Yangling 712100, Peoples R China
[2] Northwest A&F Univ, Coll Language & Culture Northwest, Yangling 712100, Peoples R China
[3] Minist Agr & Rural Affairs, Key Lab Agr Internet Things, Yangling 712100, Peoples R China
[4] Shaanxi Key Lab Agr Informat Percept Intelligent S, Yangling 712100, Peoples R China
Source
COMPUTERS & GRAPHICS-UK | 2022, Vol. 107
Keywords
Visual SLAM; RGB-D camera; Point-line feature; Field scene reconstruction; MONOCULAR SLAM
DOI
10.1016/j.cag.2022.06.013
Chinese Library Classification (CLC)
TP31 [Computer Software]
Subject classification codes
081202; 0835
Abstract
3D reconstruction of crops is important for studying their biological properties, canopy light distribution, and robotic harvesting. However, the complex field environment makes real-time 3D reconstruction of crops difficult. Because field scenes are low-textured, existing single-feature SLAM methods struggle to extract effective features for building accurate, real-time 3D maps of the field. In this paper, we propose a novel RGB-D SLAM method based on point-line feature fusion for real-time 3D reconstruction of field scenes. We first build a 3D scene map of the field on top of the point-line feature structure by jointly optimizing camera poses over point and line features. Then, a joint point-cloud filtering method is designed based on keyframe optimization of the point-line features. Finally, we obtain a globally consistent, high-quality dense map. Pose estimation and reconstruction performance is evaluated on public benchmarks and shows improvements over state-of-the-art methods. Qualitative experiments on field scenes show that our method enables robust, real-time 3D reconstruction of crops. (c) 2022 Published by Elsevier Ltd.
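To make the pipeline above concrete, the sketch below illustrates one way a joint point-line pose optimization could look: a 6-DoF camera pose is refined by minimizing point reprojection errors together with the distances of projected 3D line endpoints to their observed 2D lines. This is a minimal sketch under assumed conventions (pinhole intrinsics K, rotation-vector pose parameterization, SciPy's Levenberg-Marquardt solver), not the authors' implementation.

```python
# Hypothetical sketch, not the paper's code: joint point-line pose
# refinement. The pinhole intrinsics K, the feature containers and the
# SciPy LM solver are assumptions made for illustration.
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation


def project(pts_w, rvec, tvec, K):
    # Pinhole projection of Nx3 world points into pixel coordinates.
    R = Rotation.from_rotvec(rvec).as_matrix()
    cam = pts_w @ R.T + tvec              # world -> camera frame
    uvw = cam @ K.T                       # apply intrinsics
    return uvw[:, :2] / uvw[:, 2:3]       # perspective divide


def residuals(pose, pts_3d, pts_2d, line_ends_3d, lines_2d, K):
    # Residual vector = point reprojection errors + line alignment errors.
    rvec, tvec = pose[:3], pose[3:]
    # Point features: projected pixel minus observed pixel (N x 2 -> 2N).
    r_pts = (project(pts_3d, rvec, tvec, K) - pts_2d).ravel()
    # Line features: signed distance of each projected 3D endpoint to the
    # observed 2D line l = (a, b, c), normalised so that a^2 + b^2 = 1.
    ends = project(line_ends_3d.reshape(-1, 3), rvec, tvec, K)
    ends_h = np.hstack([ends, np.ones((len(ends), 1))])
    r_lines = np.einsum('ij,ij->i', ends_h, np.repeat(lines_2d, 2, axis=0))
    return np.concatenate([r_pts, r_lines])


def refine_pose(pose0, pts_3d, pts_2d, line_ends_3d, lines_2d, K):
    # Levenberg-Marquardt refinement of a 6-DoF pose (rotation vector + t),
    # given N 3D-2D point matches and M 3D line segments with observed
    # 2D lines. line_ends_3d has shape (M, 2, 3), lines_2d has shape (M, 3).
    sol = least_squares(residuals, pose0, method='lm',
                        args=(pts_3d, pts_2d, line_ends_3d, lines_2d, K))
    return sol.x
```

In a full system such as the one described in the abstract, residuals of this form would typically be embedded in a keyframe-based optimization rather than solved only per frame, with the refined keyframe poses then driving the joint point-cloud filtering and dense fusion stages.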
Pages: 10-19 (10 pages)