Design of visual inertial state estimator for autonomous systems via multi-sensor fusion approach

Times Cited: 2
Authors
He, Shenghuang [1 ]
Li, Yanzhou [2 ,3 ]
Lu, Yongkang [2 ,3 ]
Liu, Yishan [1 ,2 ,3 ]
Affiliations
[1] Shanghai Jiao Tong Univ, Ningbo Artificial Intelligence Inst, Dept Automat, Shanghai 200240, Peoples R China
[2] Guangdong Univ Technol, Sch Automat, Guangzhou 510006, Peoples R China
[3] Guangdong Univ Technol, Guangdong Prov Key Lab Intelligent Decis & Coopera, Guangzhou 510006, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Autonomous systems; Visual-inertial data fusion; State estimation; Inverse combination optical flow; Nonlinear optimization; KALMAN FILTER; NAVIGATION; ALGORITHM; TRACKING;
DOI
10.1016/j.mechatronics.2023.103066
Chinese Library Classification (CLC)
TP [Automation technology, computer technology];
Discipline code
0812 ;
Abstract
Autonomous navigation in autonomous systems critically hinges on robust localization and reliable mapping. This paper proposes a new visual-inertial simultaneous localization and mapping (SLAM) algorithm consisting of a visual-inertial frontend, a system backend, a loop closure detection module, and an initialization module. First, by combining the inverse combination optical flow method with an image pyramid, the algorithm addresses localization failures caused by the light sensitivity of vision sensors. To meet real-time requirements, the computational complexity of the algorithm is effectively reduced by combining FAST corner detection with the Threading Building Blocks (TBB) programming library. Second, an inertial measurement unit (IMU) pre-integration model based on fourth-order Runge-Kutta (RK4) integration effectively improves the estimation accuracy of autonomous systems. A nonlinear optimization backend based on the Dogleg method, together with sliding-window and marginalization techniques, reduces the computational complexity of backend processing. Third, to mitigate accumulated errors that lead to large pose drift over long periods, a loop closure detection module is introduced, and an initialization module is added to integrate visual and inertial data. Finally, the feasibility and robustness of the system are verified on the EuRoC dataset using the Evo precision evaluation tool.
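The RK4-based IMU propagation idea described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation (which pre-integrates orientation, velocity, and position with bias terms); it is a hypothetical, simplified model that assumes a bias-free, world-frame acceleration held constant over the step, to show how a fourth-order Runge-Kutta step improves on naive Euler integration of IMU kinematics:

```python
import numpy as np

def rk4_step(f, x, dt):
    """One fourth-order Runge-Kutta step for the ODE x' = f(x)."""
    k1 = f(x)
    k2 = f(x + 0.5 * dt * k1)
    k3 = f(x + 0.5 * dt * k2)
    k4 = f(x + dt * k3)
    return x + (dt / 6.0) * (k1 + 2.0 * k2 + 2.0 * k3 + k4)

def imu_propagate(p, v, acc_world, dt):
    """Propagate position p and velocity v under a world-frame
    acceleration measurement acc_world, held constant over dt.
    State layout: [p (3), v (3)]; dynamics: p' = v, v' = acc_world."""
    def f(state):
        return np.concatenate([state[3:], acc_world])
    state = rk4_step(f, np.concatenate([p, v]), dt)
    return state[:3], state[3:]
```

Because the simplified dynamics are polynomial in time, RK4 reproduces the closed-form result p = p0 + v0*dt + 0.5*a*dt^2 exactly; in the full model with rotating body frames, the fourth-order step mainly reduces the orientation-integration error relative to first-order schemes.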
Pages: 12
Related Papers
50 records total
  • [1] A robot state estimator based on multi-sensor information fusion
    Zhou, Yang
    Ye, Ping
    Liu, Yunhang
    2018 5TH INTERNATIONAL CONFERENCE ON SYSTEMS AND INFORMATICS (ICSAI), 2018, : 115 - 119
  • [2] An estimator for multi-sensor data fusion
    Thejaswi, C.
    Ganapathy, V.
    Patro, R. K.
    Raina, M.
    Ghosh, S. K.
    2006 IEEE INTERNATIONAL CONFERENCE ON SYSTEMS, MAN, AND CYBERNETICS, VOLS 1-6, PROCEEDINGS, 2006, : 2690 - +
  • [3] Calibration of inertial and vision systems as a prelude to multi-sensor fusion
    Randeniya, D. I. B.
    Gunaratne, M.
    Sarkar, S.
    Nazef, A.
    TRANSPORTATION RESEARCH PART C-EMERGING TECHNOLOGIES, 2008, 16 (02) : 255 - 274
  • [4] Multi-sensor data fusion structures in autonomous systems: A review
    Huang, XH
    Wang, M
    PROCEEDINGS OF THE 2003 IEEE INTERNATIONAL SYMPOSIUM ON INTELLIGENT CONTROL, 2003, : 817 - 821
  • [5] Attack and estimator design for multi-sensor systems with undetectable adversary
    Song, Haiyu
    Shi, Peng
    Lim, Cheng-Chew
    Zhang, Wen-An
    Yu, Li
    AUTOMATICA, 2019, 109
  • [6] Multi-sensor Information Fusion Steady-State Kalman Estimator for Systems with System Errors and Sensor Errors
    Li, Yun
    Zhao, Ming
    Hao, Gang
    INTERNATIONAL JOURNAL OF SECURITY AND ITS APPLICATIONS, 2016, 10 (02): : 129 - 140
  • [7] Visual Marker based Multi-Sensor Fusion State Estimation
    Luis Sanchez-Lopez, Jose
    Arellano-Quintana, Victor
    Tognon, Marco
    Campoy, Pascual
    Franchi, Antonio
    IFAC PAPERSONLINE, 2017, 50 (01): : 16003 - 16008
  • [8] Multi-Sensor Fusion for Wheel-Inertial-Visual Systems Using a Fuzzification-Assisted Iterated Error State Kalman Filter
    Huang, Guohao
    Huang, Haibin
    Zhai, Yaning
    Tang, Guohao
    Zhang, Ling
    Gao, Xingyu
    Huang, Yang
    Ge, Guoping
    SENSORS, 2024, 24 (23)
  • [9] Multi-Sensor Fusion for Navigation of Autonomous Vehicles
    Soloviev, Andrey
    PROCEEDINGS OF THE 26TH INTERNATIONAL TECHNICAL MEETING OF THE SATELLITE DIVISION OF THE INSTITUTE OF NAVIGATION (ION GNSS 2013), 2013, : 3615 - 3632
  • [10] Distributed fusion estimator for multi-sensor asynchronous sampling systems with missing measurements
    Lin, Honglei
    Sun, Shuli
    IET SIGNAL PROCESSING, 2016, 10 (07) : 724 - 731