UKF-Based IMU–LiDAR Sensor Fusion Method for Robot Navigation in Feature-Minimal Indoor Environments
Abstract
This paper proposes an Unscented Kalman Filter (UKF)-based IMU–LiDAR sensor fusion method for autonomous robot navigation in feature-poor enclosed spaces (homogeneous corridors, plain walls, changing lighting), conditions that often weaken LiDAR scan matching and cause large drift in pure IMU integration. The proposed architecture models the motion dynamics in an error-state formulation with online-estimated gyroscope and accelerometer biases, while LiDAR measurements are extracted as scan-to-submap constraints and condensed into relative-pose observations with adaptively scaled uncertainty. Time synchronization and extrinsic IMU–LiDAR calibration are refined on the fly under weak priors, keeping the system stable in the presence of time offsets. Evaluation on three indoor scenarios (a 40 m corridor, a 60 m L-shaped aisle, and a 30 × 20 m warehouse with minimal texture) shows a 42–58% reduction in positional RMSE compared to LiDAR-only ICP and a 73–81% reduction compared to IMU-only integration, with translational drift below 0.6% of the distance traveled and heading drift below 0.35°/min. The system runs in real time at 20–30 Hz on a mid-range CPU and maintains a 100% localization success rate, with no tracking failures, at velocities of 0.4–1.2 m/s. These results confirm that a UKF with adaptive uncertainty modeling and bias estimation can fuse inertial measurements with LiDAR geometric constraints to produce accurate, robust state estimation in feature-poor indoor environments, while providing an efficient foundation for downstream trajectory planning and motion control.
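To make the filtering pipeline described above concrete, the sketch below shows a generic UKF predict/update cycle with a gyroscope-bias state and a LiDAR scan-matching result condensed into a pose observation. It is a minimal illustration, not the paper's implementation: the toy planar state [px, py, yaw, gyro_bias], the motion model, and all parameters are assumptions introduced here for clarity.

```python
# Minimal UKF sketch: IMU-driven prediction with online gyro-bias estimation,
# corrected by a condensed LiDAR pose observation. Toy model; the paper's
# actual error-state formulation and noise settings are not reproduced here.
import numpy as np

def sigma_points(x, P, alpha=1.0, beta=2.0, kappa=0.0):
    """Standard scaled unscented transform (2n+1 sigma points as rows)."""
    n = x.size
    lam = alpha**2 * (n + kappa) - n
    S = np.linalg.cholesky((n + lam) * P)            # matrix square root
    pts = np.vstack([x, x + S.T, x - S.T])
    wm = np.full(2 * n + 1, 0.5 / (n + lam))         # mean weights
    wc = wm.copy()                                   # covariance weights
    wm[0] = lam / (n + lam)
    wc[0] = wm[0] + (1.0 - alpha**2 + beta)
    return pts, wm, wc

def f(x, omega_meas, v, dt):
    """Propagate with the measured yaw rate corrected by the estimated bias.
    Angle wrapping is omitted for brevity."""
    px, py, yaw, b = x
    return np.array([px + v * dt * np.cos(yaw),
                     py + v * dt * np.sin(yaw),
                     yaw + (omega_meas - b) * dt,
                     b])                             # bias as a random walk

def h(x):
    """LiDAR scan-to-submap matching condensed to a pose observation."""
    return x[:3]                                     # [px, py, yaw]

def ukf_step(x, P, Q, R, z, omega_meas, v, dt):
    # --- predict: push sigma points through the motion model ---
    pts, wm, wc = sigma_points(x, P)
    Xp = np.array([f(p, omega_meas, v, dt) for p in pts])
    x_pred = wm @ Xp
    P_pred = Q + sum(w * np.outer(d, d) for w, d in zip(wc, Xp - x_pred))
    # --- update: fuse the condensed LiDAR pose observation ---
    pts, wm, wc = sigma_points(x_pred, P_pred)
    Zp = np.array([h(p) for p in pts])
    z_pred = wm @ Zp
    S = R + sum(w * np.outer(d, d) for w, d in zip(wc, Zp - z_pred))
    Pxz = sum(w * np.outer(dx, dz)
              for w, dx, dz in zip(wc, pts - x_pred, Zp - z_pred))
    K = Pxz @ np.linalg.inv(S)                       # Kalman gain
    return x_pred + K @ (z - z_pred), P_pred - K @ S @ K.T
```

The abstract's adaptive uncertainty modeling would enter through R: for example, inflating the measurement covariance when the scan matcher's fitness score degrades in featureless stretches, so weak LiDAR constraints are down-weighted rather than trusted blindly.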