Study on 2.5D Map Building and Map Merging Method for Rescue Robot Navigation

  • Kim, Su Ho (Department of Mechatronics Engineering, Tech University of Korea)
  • Shim, Jae Hong (Department of Mechatronics Engineering, Tech University of Korea)
  • Received : 2022.03.07
  • Accepted : 2022.03.21
  • Published : 2022.04.30

Abstract

The purpose of this study was to investigate whether collaboration among multiple aerial and ground robots can increase the efficiency of disaster rescue operations. Each robot builds a 2.5D map, and the individual maps are merged into a single 2.5D map. A 2.5D map can be handled by the low-specification onboard controller of an aerial robot and is well suited to ground-robot navigation. For localization of the aerial robot, a six-degree-of-freedom pose estimation method based on visual-inertial odometry (VIO) was applied. To build the 2.5D map, an image conversion technique was employed, and to merge 2.5D maps, an image similarity calculation technique based on wall features was used. The reliability of the merged 2.5D map was evaluated by performing localization and navigation with a ground robot. As a result, the position could be estimated with a mean error and standard deviation of less than 0.3 m in areas where the 2.5D map was built normally, and collisions occurred only four times, all with the obstacle of the smallest volume. Based on the 2.5D map building and map merging system for aerial robots used in this study, it is expected that disaster response efficiency can be improved by combining the advantages of heterogeneous robots.
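The abstract describes two image-based steps: converting the aerial robot's 3D mapping output into a 2.5D height map, and merging two 2.5D maps by comparing wall features. The sketch below illustrates the general idea in Python with NumPy and OpenCV; the grid resolution, height-to-intensity encoding, and the use of Canny edges with normalized cross-correlation (cv2.matchTemplate) are assumptions made for illustration, not the authors' exact pipeline.

```python
import numpy as np
import cv2


def points_to_25d(points, resolution=0.05, max_height=2.0):
    """Project 3D occupied points (N x 3, metres) onto a 2.5D grid image.

    Each cell keeps the maximum observed height, encoded as an 8-bit
    grayscale value (0 = free/unknown). Parameters are illustrative.
    """
    xy = np.floor(points[:, :2] / resolution).astype(int)
    xy -= xy.min(axis=0)                              # shift indices to start at 0
    h, w = xy[:, 1].max() + 1, xy[:, 0].max() + 1
    grid = np.zeros((h, w), dtype=np.uint8)
    z = (np.clip(points[:, 2] / max_height, 0.0, 1.0) * 255).astype(np.uint8)
    np.maximum.at(grid, (xy[:, 1], xy[:, 0]), z)      # tallest point per cell
    return grid


def merge_25d(map_a, map_b):
    """Align a smaller sub-map (map_b) to map_a via wall-edge similarity."""
    # Walls show up as strong intensity edges in the 2.5D image.
    edges_a = cv2.Canny(map_a, 50, 150)
    edges_b = cv2.Canny(map_b, 50, 150)
    # Normalized cross-correlation as the image-similarity score.
    score = cv2.matchTemplate(edges_a, edges_b, cv2.TM_CCORR_NORMED)
    _, _, _, (tx, ty) = cv2.minMaxLoc(score)          # best translation of map_b
    merged = map_a.copy()
    hb, wb = map_b.shape
    region = merged[ty:ty + hb, tx:tx + wb]
    np.maximum(region, map_b, out=region)             # keep the taller cell
    return merged
```

A merged 2.5D image like this can then be thresholded at the ground robot's body height to obtain a 2D occupancy grid for localization and path planning.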

Acknowledgement

This work was supported by the 2019 Basic Research Program of the National Research Foundation of Korea and the Gyeonggi-do Regional Research Center (GRRC) program (Multi-material Machining Technology Innovation Research Center), No. 2019R1F1A1061579, GRRC 2020-B02.
