

一种基于因子图消元优化的激光雷达视觉惯性融合SLAM方法
A Fusion SLAM Method for LiDAR, Vision and IMU Based on Factor Graph Elimination Optimization


袁国帅 1   齐咏生 1,2,3 *   刘利强 1,2,3   苏建强 1,2   张丽杰 1,2,3  
Abstract  Single-sensor SLAM (Simultaneous Localization And Mapping) suffers from low accuracy and poor reliability in complex environments. To address this, a LiDAR-visual-IMU (Inertial Measurement Unit) fusion SLAM algorithm based on factor graph elimination optimization is proposed (Multi-Factor-Graph fusion SLAM with IMU as the Dominant system, ID-MFG-SLAM). First, a multi-factor-graph model is adopted in which the IMU serves as the dominant system and the camera and LiDAR serve as auxiliary systems: observation factors from the auxiliary systems constrain the IMU bias, while IMU odometry factors are received back for motion prediction and fusion, forming a novel structure. Second, to reduce the optimization cost after fusion, a sliding-window mechanism is added and a QR-decomposition elimination method based on Householder transformations is designed to convert the factor graph into a Bayesian network, simplifying the graph structure and improving computational efficiency. Finally, an adaptive interpolation algorithm that switches between quaternion spherical linear interpolation and linear interpolation is introduced, projecting LiDAR point clouds onto a unit sphere to estimate the depth of visual feature points. Experimental results show that, compared with other classic algorithms, the method achieves absolute trajectory errors of about 0.68 m and 0.24 m in complex large and small scenes, respectively, offering higher accuracy and reliability.
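The Householder-based QR elimination described in the abstract can be illustrated in miniature: eliminating a factor graph's variables in a fixed order is equivalent to triangularizing the stacked measurement Jacobian, and the resulting upper-triangular factor R encodes the Bayesian-network conditionals (the square-root information matrix). A minimal NumPy sketch follows; the 8×4 toy Jacobian and its dimensions are invented for illustration and are not taken from the paper.

```python
import numpy as np

def householder_qr(A):
    """QR factorization via Householder reflections.

    Each reflection zeroes one column below the diagonal, which mirrors
    eliminating one variable of the factor graph; the upper-triangular R
    plays the role of the Bayes-net (square-root information) factor.
    """
    m, n = A.shape
    R = A.astype(float).copy()
    Q = np.eye(m)
    for k in range(min(m - 1, n)):
        x = R[k:, k]
        # Householder vector reflecting x onto a multiple of e1,
        # with the sign chosen to avoid cancellation
        v = x.copy()
        v[0] += np.copysign(np.linalg.norm(x), x[0] if x[0] != 0 else 1.0)
        norm_v = np.linalg.norm(v)
        if norm_v < 1e-12:
            continue  # column already eliminated
        v /= norm_v
        H = np.eye(m - k) - 2.0 * np.outer(v, v)   # involutory reflector
        R[k:, :] = H @ R[k:, :]                    # eliminate column k
        Q[:, k:] = Q[:, k:] @ H                    # accumulate orthogonal Q
    return Q, R

# Toy "factor graph": 8 residuals over a 4-dimensional state
rng = np.random.default_rng(0)
A = rng.standard_normal((8, 4))
b = rng.standard_normal(8)
Q, R = householder_qr(A)
# Back-substitution on the triangular system gives the least-squares update
x = np.linalg.solve(R[:4, :4], (Q.T @ b)[:4])
```

In practice `np.linalg.qr` performs the same Householder factorization; the explicit loop above only makes the per-variable elimination visible.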
Source  Acta Electronica Sinica (电子学报), 2023, 51(11): 3042-3052 [Core collection]
DOI 10.12263/DZXB.20230209
Keywords  simultaneous localization and mapping; multi-sensor fusion; complex scenes; LiDAR; IMU odometry; factor graph optimization
Address

1. College of Electric Power, Inner Mongolia University of Technology, Hohhot 010080, Inner Mongolia, China

2. Engineering Research Center of Smart Energy Technology and Equipment for Higher Education Institutions of Inner Mongolia Autonomous Region, Hohhot 010080, Inner Mongolia, China

3. Engineering Research Center of Large-Scale Energy Storage Technology, Ministry of Education, Hohhot 010080, Inner Mongolia, China

Language  Chinese
Document type  Research article
ISSN  0372-2112
Subject  Automation and computer technology
Funding  National Natural Science Foundation of China; Inner Mongolia Science and Technology Research Project; Inner Mongolia Natural Science Foundation
CSCD record No.  CSCD:7641759

References (26 in total; 20 shown)

1. Bailey T. Simultaneous localization and mapping (SLAM): Part II. IEEE Robotics & Automation Magazine, 2006, 13(3): 108-117. CSCD cited 105
2. Zhang J. Low-drift and real-time LiDAR odometry and mapping. Autonomous Robots, 2017, 41(2): 401-416. CSCD cited 78
3. Shan T X. LeGO-LOAM: Lightweight and ground-optimized lidar odometry and mapping on variable terrain. 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2019: 4758-4765. CSCD cited 2
4. Mur-Artal R. ORB-SLAM: A versatile and accurate monocular SLAM system. IEEE Transactions on Robotics, 2015, 31(5): 1147-1163. CSCD cited 483
5. Mur-Artal R. ORB-SLAM2: An open-source SLAM system for monocular, stereo, and RGB-D cameras. IEEE Transactions on Robotics, 2017, 33(5): 1255-1262. CSCD cited 461
6. Tubman R. Efficient robotic SLAM by fusion of RatSLAM and RGBD-SLAM. 2016 23rd International Conference on Mechatronics and Machine Vision in Practice (M2VIP), 2017: 1-6. CSCD cited 1
7. Liu Z X. Mobile robot positioning method based on multi-sensor information fusion laser SLAM. Cluster Computing, 2019, 22(2): 5055-5061. CSCD cited 4
8. Lin J R. R2LIVE: A robust, real-time, LiDAR-inertial-visual tightly-coupled state estimator and mapping. IEEE Robotics and Automation Letters, 2021, 6(4): 7469-7476. CSCD cited 3
9. Lin J R. R3LIVE: A robust, real-time, RGB-colored, LiDAR-inertial-visual tightly-coupled state estimation and mapping package. 2022 International Conference on Robotics and Automation (ICRA), 2022: 10672-10678. CSCD cited 2
10. Shan T X. LIO-SAM: Tightly-coupled lidar inertial odometry via smoothing and mapping. 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2021: 5135-5142. CSCD cited 2
11. Xu W. FAST-LIO: A fast, robust LiDAR-inertial odometry package by tightly-coupled iterated Kalman filter. IEEE Robotics and Automation Letters, 2021, 6(2): 3317-3324. CSCD cited 27
12. Xu W. FAST-LIO2: Fast direct LiDAR-inertial odometry. IEEE Transactions on Robotics, 2022, 38(4): 2053-2073. CSCD cited 16
13. Qin T. VINS-Mono: A robust and versatile monocular visual-inertial state estimator. IEEE Transactions on Robotics, 2018, 34(4): 1004-1020. CSCD cited 216
14. Qin T. A general optimization-based framework for local odometry estimation with multiple sensors, 2019. CSCD cited 7
15. Campos C. ORB-SLAM3: An accurate open-source library for visual, visual-inertial, and multimap SLAM. IEEE Transactions on Robotics, 2021, 37(6): 1874-1890. CSCD cited 123
16. Zhang J. Visual-LiDAR odometry and mapping: Low-drift, robust, and fast. 2015 IEEE International Conference on Robotics and Automation (ICRA), 2015: 2174-2181. CSCD cited 3
17. Shao W Z. Stereo visual inertial LiDAR simultaneous localization and mapping. 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2020: 370-377. CSCD cited 1
18. Shan T X. LVI-SAM: Tightly-coupled lidar-visual-inertial odometry via smoothing and mapping. 2021 IEEE International Conference on Robotics and Automation (ICRA), 2021: 5692-5698. CSCD cited 3
19. Yin J E. M2DGR: A multi-sensor and multi-scenario SLAM dataset for ground robots. IEEE Robotics and Automation Letters, 2022, 7(2): 2266-2273. CSCD cited 1
20. Hsu L T. UrbanNav: An open-sourced multisensory dataset for benchmarking positioning algorithms designed for urban areas. Proceedings of the 34th International Technical Meeting of the Satellite Division of the Institute of Navigation. Missouri: Institute of Navigation, 2021: 226-256. CSCD cited 1
Citing papers (2)

1. Lu Yuanjie (卢元杰). Research on a UAV autonomous positioning system oriented to systems engineering. Journal of Graphics, 2024, 45(2): 363-368. CSCD cited 0

2. Cen Zhihao (岑志豪). A relocalizable LiDAR-visual-inertial indoor positioning method. Journal of Navigation and Positioning, 2024, 12(4): 165-173. CSCD cited 0


