
A multi-metric LiDAR Simultaneous Localization and Mapping system based on intensity feature assistance and two-step decoupling estimation

Published online by Cambridge University Press:  22 July 2025

Zhenghui Xu
Affiliation:
School of Intelligent Engineering and Automation, Beijing University of Posts and Telecommunications, Beijing, China
Jian Li*
Affiliation:
School of Intelligent Engineering and Automation, Beijing University of Posts and Telecommunications, Beijing, China
Shimin Wei
Affiliation:
School of Intelligent Engineering and Automation, Beijing University of Posts and Telecommunications, Beijing, China
Ling Tang
Affiliation:
School of Intelligent Engineering and Automation, Beijing University of Posts and Telecommunications, Beijing, China
Huanlong Chen
Affiliation:
Shanghai Institute of Aerospace System Engineering, Shanghai, China
Corresponding author: Jian Li; Email: jianli_628@126.com

Abstract

To address the accuracy degradation, localization drift, and even outright failure of Simultaneous Localization and Mapping (SLAM) algorithms in unstructured environments with sparse geometric features, such as outdoor parks, highways, and urban roads, a multi-metric light detection and ranging (LiDAR) SLAM system based on the fusion of geometric and intensity features is proposed. First, to address insufficient feature extraction in sparse scenes, an adaptive method for extracting multiple types of geometric features and salient intensity features is proposed: in addition to the traditional edge and planar features, vertex features are extracted to exploit the geometric information more fully, and intensity edge features are extracted in areas with significant intensity changes to add a further level of environmental perception. Second, state estimation uses a multi-metric error formulation based on point-to-point, point-to-line, and point-to-plane distances, together with a two-step decoupling strategy that improves pose estimation accuracy. Finally, qualitative and quantitative experiments on public datasets demonstrate that, compared with state-of-the-art purely geometric and intensity-assisted LiDAR SLAM algorithms, the proposed algorithm achieves superior localization accuracy and mapping clarity, improving absolute trajectory error (ATE) accuracy by 28.93% while maintaining real-time performance with per-frame processing times of up to 62.9 ms. Additionally, tests conducted in real campus environments further validate the effectiveness of the approach in complex, unstructured scenarios.
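The three distance metrics named in the abstract (point-to-point, point-to-line, point-to-plane) are standard residuals in LiDAR scan registration. The sketch below is an illustrative NumPy formulation of those residuals, not the paper's implementation; the function names and the example geometry are assumptions for demonstration only.

```python
import numpy as np

def point_to_point(p, q):
    # Euclidean distance between a query point and its matched map point.
    return np.linalg.norm(p - q)

def point_to_line(p, a, d):
    # Distance from p to the line through point a with unit direction d,
    # via the norm of the rejection (cross product with the direction).
    return np.linalg.norm(np.cross(p - a, d))

def point_to_plane(p, a, n):
    # Signed distance from p to the plane through point a with unit normal n.
    return float(np.dot(p - a, n))

# Example: a query point against a matched point, the x-axis, and the z=0 plane.
p = np.array([1.0, 2.0, 3.0])
r_pp = point_to_point(p, np.array([1.0, 2.0, 0.0]))                  # 3.0
r_pl = point_to_line(p, np.zeros(3), np.array([1.0, 0.0, 0.0]))      # sqrt(13)
r_pn = point_to_plane(p, np.zeros(3), np.array([0.0, 0.0, 1.0]))     # 3.0
```

In a multi-metric estimator, residuals of this kind (point-to-point for vertex features, point-to-line for edge features, point-to-plane for planar features) would be stacked and minimized jointly over the sensor pose.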

Information

Type
Research Article
Copyright
© The Author(s), 2025. Published by Cambridge University Press

