References
- R. Mur-Artal and J. D. Tardos, ORB-SLAM2: An open-source SLAM system for monocular, stereo and RGB-D cameras, IEEE Trans. Robot. 33 (2017), no. 5, 1255-1262. https://doi.org/10.1109/TRO.2017.2705103
- R. A. Newcombe et al., KinectFusion: Real-time dense surface mapping and tracking, in Proc. IEEE Int. Symp. Mix. Augmented Real. (Basel, Switzerland), Oct. 2011, pp. 127-136.
- T. Whelan et al., ElasticFusion: Real-time dense SLAM and light source estimation, Int. J. Robot. Res. 35 (2016), no. 14, 1697-1716. https://doi.org/10.1177/0278364916669237
- B. Canovas et al., Speed and memory efficient dense RGB-D SLAM in dynamic scenes, in Proc. IEEE/RSJ Int. Conf. Intell. Robot. Syst. (Las Vegas, NV, USA), Oct. 2020, pp. 4996-5001.
- B. Canovas et al., A coarse and relevant 3D representation for fast and lightweight RGB-D mapping, in Proc. Int. Conf. Comput. Vis. Theory Appl. (Prague, Czech Republic), Feb. 2019, pp. 824-831.
- M. Niessner et al., Real-time 3D reconstruction at scale using voxel hashing, ACM Trans. Graph. 32 (2013), no. 6, 1-11.
- K. Wang, F. Gao, and S. Shen, Real-time scalable dense surfel mapping, in Proc. Int. Conf. Robot. Autom. (Montreal, Canada), May 2019, pp. 6919-6925.
- F. Endres et al., 3-D mapping with an RGB-D camera, IEEE Trans. Robot. 30 (2014), no. 1, 177-187. https://doi.org/10.1109/TRO.2013.2279412
- A. Hornung et al., OctoMap: An efficient probabilistic 3D mapping framework based on octrees, Auton. Robots 34 (2013), no. 3, 189-206. https://doi.org/10.1007/s10514-012-9321-0
- D. Yang et al., DRE-SLAM: Dynamic RGB-D encoder SLAM for a differential-drive robot, Remote Sens. 11 (2019), no. 4, 380. https://doi.org/10.3390/rs11040380
- N. Perez-Higueras et al., 3D exploration and navigation with Optimal-RRT planners for ground robots in indoor incidents, Sensors 20 (2020), no. 1, 220. https://doi.org/10.3390/s20010220
- R. Scona et al., StaticFusion: Background reconstruction for dense RGB-D SLAM in dynamic environments, in Proc. IEEE Int. Conf. Robot. Autom. (Brisbane, Australia), May 2018, pp. 3849-3856.
- T. Zhang et al., FlowFusion: Dynamic dense RGB-D SLAM based on optical flow, in Proc. IEEE Int. Conf. Robot. Autom. (Paris, France), May 2020, pp. 7322-7328.
- M. Runz and L. Agapito, Co-Fusion: Real-time segmentation, tracking and fusion of multiple objects, in Proc. IEEE Int. Conf. Robot. Autom. (Singapore), May 2017, pp. 4471-4478.
- M. Runz, M. Buffier, and L. Agapito, MaskFusion: Real-time recognition, tracking and reconstruction of multiple moving objects, in Proc. IEEE Int. Symp. Mix. Augmented Real. (Munich, Germany), Oct. 2018, pp. 10-20.
- Z. Wang et al., A computationally efficient semantic SLAM solution for dynamic scenes, Remote Sens. 11 (2019), no. 11, 1363. https://doi.org/10.3390/rs11111363
- J. Redmon and A. Farhadi, YOLOv3: An incremental improvement, arXiv preprint arXiv:1804.02767, 2018.
- S. Yang et al., SGC-VSLAM: A semantic and geometric constraints VSLAM for dynamic indoor environments, Sensors 20 (2020), no. 8, 2432. https://doi.org/10.3390/s20082432
- A. Ratter and C. Sammut, Fused 2d/3d position tracking for robust slam on mobile robots, in Proc. IEEE/RSJ Int. Conf. Intell. Robot. Syst. (Hamburg, Germany), Sept. 2015, pp. 1962-1969.
- T. Laidlow et al., Dense RGB-D-inertial slam with map deformations, in Proc. IEEE/RSJ Int. Conf. Intell. Robot. Syst. (Vancouver, Canada), Sept. 2017, pp. 6741-6748.
- C. Houseago, M. Bloesch, and S. Leutenegger, KO-Fusion: Dense visual SLAM with tightly-coupled kinematic and odometric tracking, in Proc. Int. Conf. Robot. Autom. (Montreal, Canada), May 2019, pp. 4054-4060.
- K. Yamaguchi, D. McAllester, and R. Urtasun, Efficient joint segmentation, occlusion labeling, stereo and flow estimation, in Computer vision-ECCV 2014, Springer, Zurich, Switzerland, 2014, pp. 756-771.
- A. Bochkovskiy, C. Wang, and H. M. Liao, YOLOv4: Optimal speed and accuracy of object detection, arXiv preprint arXiv:2004.10934, 2020.
- T. Kroeger et al., Fast optical flow using dense inverse search, in Computer vision-ECCV 2016, Springer, Amsterdam, Netherlands, 2016, pp. 471-488.
- J. Huang et al., Optical flow based real-time moving object detection in unconstrained scenes, arXiv preprint arXiv:1807.04890, 2018.
- C. X. Guo, F. M. Mirzaei, and S. I. Roumeliotis, An analytical least-squares solution to the odometer-camera extrinsic calibration problem, in Proc. IEEE Int. Conf. Robot. Autom. (Saint Paul, MN, USA), May 2012, pp. 3962-3968.
- M. Aladem and S. Rawashdeh, Lightweight visual odometry for autonomous mobile robots, Sensors 18 (2018), no. 9, 2837. https://doi.org/10.3390/s18092837
- J. Bian et al., GMS: Grid-based motion statistics for fast, ultra-robust feature correspondence, in Proc. IEEE Conf. Comput. Vis. Pattern Recognit. (Honolulu, HI, USA), July 2017, pp. 2828-2837.
- S. Rusinkiewicz, A symmetric objective function for ICP, ACM Trans. Graph. 38 (2019), no. 4, 1-7. https://doi.org/10.1145/3306346.3323037
- J. Xie et al., Fine registration of 3D point clouds fusing structural and photometric information using an RGB-D camera, J. Visual Commun. Image Represent. 32 (2015), 194-204. https://doi.org/10.1016/j.jvcir.2015.08.007
- B. Glocker et al., Real-time RGB-D camera relocalization via randomized ferns for keyframe encoding, IEEE Trans. Visual. Comput. Graph. 21 (2015), 571-583. https://doi.org/10.1109/TVCG.2014.2360403
- M. Grupp, evo: Python package for the evaluation of odometry and SLAM, 2017, https://github.com/MichaelGrupp/evo