http://dx.doi.org/10.7746/jkros.2021.16.3.179

Intensity and Ambient Enhanced Lidar-Inertial SLAM for Unstructured Construction Environment  

Jung, Minwoo (Dept. of Civil and Environmental Engineering, KAIST)
Jung, Sangwoo (Dept. of Civil and Environmental Engineering, KAIST)
Jang, Hyesu (Dept. of Civil and Environmental Engineering, KAIST)
Kim, Ayoung (Dept. of Civil and Environmental Engineering, KAIST)
Publication Information
The Journal of Korea Robotics Society, vol. 16, no. 3, 2021, pp. 179-188
Abstract
Construction monitoring is one of the key modules in smart construction. Unlike structured urban environments, construction site mapping is challenging due to the characteristics of an unstructured environment. For example, irregular feature points and unreliable matching hinder the creation of a map usable for site management. To tackle this issue, we propose a system for data acquisition in unstructured environments and a framework, Intensity and Ambient Enhanced Lidar Inertial Odometry via Smoothing and Mapping (IA-LIO-SAM), that achieves highly accurate robot trajectories and mapping. IA-LIO-SAM uses the same factor graph as Tightly-coupled Lidar Inertial Odometry via Smoothing and Mapping (LIO-SAM). Extending LIO-SAM, IA-LIO-SAM leverages each point's intensity and ambient values to remove unnecessary feature points. These additional values also act as a new factor in the K-Nearest Neighbor (KNN) search, allowing accurate comparison between stored map points and newly scanned points. The performance was verified in three different environments and compared with LIO-SAM.
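To make the idea concrete, the following minimal Python sketch shows one way intensity and ambient channels could be used: weak returns are filtered out before feature extraction, and the remaining points are matched in an augmented space that appends weighted intensity and ambient values to the 3-D coordinates before a KD-tree nearest-neighbor query. The weights W_INTENSITY and W_AMBIENT and the threshold INTENSITY_MIN are illustrative assumptions, not values from the paper, and this is not the authors' implementation (LIO-SAM and IA-LIO-SAM are C++ systems).

import numpy as np
from scipy.spatial import cKDTree

# Assumed (hypothetical) channel weights and filtering threshold.
W_INTENSITY = 0.1
W_AMBIENT = 0.1
INTENSITY_MIN = 5.0

def filter_points(xyz, intensity, ambient, intensity_min=INTENSITY_MIN):
    """Drop candidate feature points whose intensity is too weak to be reliable."""
    keep = intensity > intensity_min
    return xyz[keep], intensity[keep], ambient[keep]

def augmented_knn(map_xyz, map_int, map_amb, scan_xyz, scan_int, scan_amb, k=5):
    """Nearest-neighbor search in (x, y, z, w_i * intensity, w_a * ambient) space."""
    map_feat = np.hstack([map_xyz,
                          W_INTENSITY * map_int[:, None],
                          W_AMBIENT * map_amb[:, None]])
    scan_feat = np.hstack([scan_xyz,
                           W_INTENSITY * scan_int[:, None],
                           W_AMBIENT * scan_amb[:, None]])
    tree = cKDTree(map_feat)          # KD-tree over the augmented map points
    dists, idx = tree.query(scan_feat, k=k)
    return dists, idx

With the extra channels, two points that are close in space but lie on surfaces with very different reflectivity receive a larger augmented distance, so they are less likely to be associated than in a purely geometric KNN search.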
Keywords
Intensity; Ambient; IA-LIO-SAM; LiDAR Odometry; SLAM;
References
1 C. Qin, H. Ye, C. E. Pranata, J. Han, S. Zhang, and M. Liu, "LINS: A Lidar-Inertial State Estimator for Robust and Efficient Navigation," 2020 IEEE International Conference on Robotics and Automation (ICRA), pp. 8899-8906, 2020, DOI: 10.1109/ICRA40945.2020.9197567.
2 H. Ye, Y. Chen, and M. Liu, "Tightly Coupled 3D Lidar Inertial Odometry and Mapping," 2019 International Conference on Robotics and Automation (ICRA), pp. 3144-3150, 2019, DOI: 10.1109/ICRA.2019.8793511.
3 T.-M. Nguyen, M. Cao, S. Yuan, Y. Lyu, T. H. Nguyen, and L. Xie, "LIRO: Tightly Coupled Lidar-Inertia-Ranging Odometry," IEEE International Conference on Robotics and Automation (ICRA), 2020, [Online], https://arxiv.org/abs/2010.13072.
4 C. Forster, L. Carlone, F. Dellaert, and D. Scaramuzza, "On-Manifold Preintegration for Real-Time Visual-Inertial Odometry," IEEE Transactions on Robotics, vol. 33, no. 1, pp. 1-21, Feb., 2017, DOI: 10.1109/TRO.2016.2597321.
5 T. Shan, B. Englot, F. Duarte, C. Ratti, and D. Rus, "Robust Place Recognition using an Imaging Lidar," 2021 IEEE International Conference on Robotics and Automation (ICRA), 2021, [Online], https://arxiv.org/abs/2103.02111v2.
6 X. Chen, T. Labe, A. Milioto, T. Rohling, O. Vysotska, A. Haag, J. Behley, and C. Stachniss, "OverlapNet: Loop Closing for LiDAR-based SLAM," Robotics: Science and Systems (RSS), 2020, [Online], https://arxiv.org/abs/2105.11344v1.
7 Y. S. Park, H. Jang, and A. Kim, "I-LOAM: Intensity Enhanced LiDAR Odometry and Mapping," 2020 17th International Conference on Ubiquitous Robots (UR), Kyoto, Japan, pp. 455-458, 2020, DOI: 10.1109/UR49135.2020.9144987.
8 J. Jeong, Y. Cho, Y.-S. Shin, H. Roh, and A. Kim, "Complex Urban Dataset with Multi-level Sensors from Highly Diverse Urban Environments," International Journal of Robotics Research, vol. 38, no. 6, pp. 642-657, 2019, DOI: 10.1177/0278364919843996.
9 P. J. Besl and N. D. McKay, "A Method for Registration of 3-D Shapes," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 14, no. 2, pp. 239-256, 1992, DOI: 10.1109/34.121791.
10 T. Shan, B. Englot, D. Meyers, W. Wang, C. Ratti, and D. Rus, "LIO-SAM: Tightly-coupled Lidar Inertial Odometry via Smoothing and Mapping," 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Las Vegas, NV, USA, 2020, DOI: 10.1109/IROS45743.2020.9341176.
11 A. Segal, D. Haehnel, and S. Thrun, "Generalized-ICP," Robotics: Science and Systems, Seattle, USA, 2009, DOI: 10.15607/RSS.2009.
12 D. Chetverikov, D. Svirko, D. Stepanov, and P. Krsek, "The Trimmed Iterative Closest Point Algorithm," Object Recognition Supported by User Interaction for Service Robots (ICPR), Quebec City, QC, Canada, 2002, DOI: 10.1109/ICPR.2002.1047997.
13 J.-M. Lee and G.-W. Kim, "A Camera Pose Estimation Method for Rectangle Feature based Visual SLAM," Journal of Korea Robotics Society, vol. 11, no. 1, Mar., 2016, DOI: 10.7746/jkros.2016.11.1.033.
14 D. G. Lowe, "Distinctive Image Features from Scale-Invariant Keypoints," International Journal of Computer Vision, vol. 60, no. 2, pp. 91-110, 2004, DOI: 10.1023/B:VISI.0000029664.99615.94.
15 T. Shan and B. Englot, "LeGO-LOAM: Lightweight and Ground-Optimized Lidar Odometry and Mapping on Variable Terrain," 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 4758-4765, 2018, DOI: 10.1109/IROS.2018.8594299.
16 J. H. Lee, G. Zhang, and I. H. Suh, "Motion Estimation Using 3-D Straight Lines," Journal of Korea Robotics Society, vol. 11, no. 4, Dec., 2016, DOI: 10.7746/jkros.2016.11.4.300.
17 H. Wang, C. Wang, and L. Xie, "Intensity Scan Context: Coding Intensity and Geometry Relations for Loop Closure Detection," 2020 IEEE International Conference on Robotics and Automation (ICRA), pp. 2095-2101, 2020, DOI: 10.1109/ICRA40945.2020.9196764.
18 H. Bay, A. Ess, T. Tuytelaars, and L. Van Gool, "Speeded-Up Robust Features (SURF)," Computer Vision and Image Understanding, vol. 110, no. 3, pp. 346-359, 2008, DOI: 10.1016/j.cviu.2007.09.014.
19 E. Rublee, V. Rabaud, K. Konolige, and G. Bradski, "ORB: An Efficient Alternative to SIFT or SURF," 2011 International Conference on Computer Vision (ICCV), pp. 2564-2571, 2011, DOI: 10.1109/ICCV.2011.6126544.
20 J. Zhang and S. Singh, "Low-drift and Real-time Lidar Odometry and Mapping," Autonomous Robots, vol. 41, pp. 401-416, 2017, DOI: 10.1007/s10514-016-9548-2.