Onboard dynamic RGB-D simultaneous localization and mapping for mobile robot navigation

  • Canovas, Bruce (Université Grenoble Alpes, CNRS, Grenoble INP, GIPSA-lab) ;
  • Nègre, Amaury (Université Grenoble Alpes, CNRS, Grenoble INP, GIPSA-lab) ;
  • Rombaut, Michèle (Université Grenoble Alpes, CNRS, Grenoble INP, GIPSA-lab)
  • Received : 2021.02.20
  • Accepted : 2021.06.16
  • Published : 2021.08.01

Abstract

Although current visual simultaneous localization and mapping (SLAM) algorithms provide highly accurate tracking and mapping, most are too computationally demanding to run live on embedded devices. In addition, the maps they produce are often unsuitable for path planning. To mitigate these issues, we propose a fully closed-loop online dense RGB-D SLAM algorithm targeting autonomous indoor mobile robot navigation tasks. The proposed algorithm runs live on an NVIDIA Jetson board embedded in a two-wheel differential-drive robot. It exhibits lightweight three-dimensional mapping, room-scale consistency, accurate pose tracking, and robustness to moving objects. Furthermore, we introduce a navigation strategy based on the proposed algorithm. Experimental results demonstrate the robustness of the proposed SLAM algorithm, its computational efficiency, and its benefits for on-the-fly navigation while mapping.
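As context for the pose-tracking claim above, the sketch below shows one linearized point-to-plane iterative closest point (ICP) update, the kind of dense geometric alignment step that RGB-D trackers commonly build on. It is a generic, illustrative NumPy sketch; the function name and interface are assumptions made here for clarity, not the authors' onboard implementation, which the full paper describes.

```python
import numpy as np

def point_to_plane_icp_step(src, dst, normals):
    """One linearized Gauss-Newton step of point-to-plane ICP.

    src, dst : (N, 3) arrays of matched 3D points (current frame and model).
    normals  : (N, 3) unit surface normals at the model points.
    Returns a 4x4 rigid transform that moves src toward dst.
    """
    # Signed point-to-plane residuals: (p_i - q_i) . n_i
    r = np.einsum("ij,ij->i", src - dst, normals)
    # Jacobian rows for the twist [omega, t]: [p_i x n_i, n_i]
    J = np.hstack([np.cross(src, normals), normals])
    # Least-squares solve of J x = -r for the 6-DoF update
    x, *_ = np.linalg.lstsq(J, -r, rcond=None)
    wx, wy, wz = x[:3]
    # Small-angle rotation from the skew-symmetric generator
    R = np.eye(3) + np.array([[0.0, -wz,  wy],
                              [ wz, 0.0, -wx],
                              [-wy,  wx, 0.0]])
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = x[3:]
    return T
```

In a live system such a step would operate on correspondences from projective data association and be iterated to convergence, typically on the GPU to fit the real-time budget of an embedded board.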
