
Extended and Adaptive Inverse Perspective Mapping for Ground Representation of Autonomous Mobile Robot

(Korean title: Extended Adaptive Inverse Perspective Mapping Method for Ground Representation of an Autonomous Mobile Robot)

  • Jooyong Park (Department of Electrical Engineering, Inha University)
  • Younggun Cho (Department of Electrical Engineering, Inha University)
  • Received : 2022.10.31
  • Accepted : 2022.12.25
  • Published : 2023.02.28

Abstract

This paper proposes an Extended and Adaptive Inverse Perspective Mapping (EA-IPM) model that obtains an accurate bird's-eye view (BEV) from a forward-looking monocular camera on sidewalks with various curves. While Inverse Perspective Mapping (IPM) is an effective way to obtain ground information, conventional methods assume a fixed relationship between the camera and the ground. Mobile robots, however, operate mostly in pedestrian environments with frequent motion changes rather than on flat roads, and these motion changes severely degrade IPM results. We therefore extend the existing adaptive IPM model, which is robust to pitch motions, with a complementary Y-derivation process and a formulation for roll motions, making IPM applicable on sidewalks. To validate the performance of the proposed method, we evaluated our results on both synthetic and real road and sidewalk datasets.

Keywords

Acknowledgement

This work was supported by the Institute of Information & Communications Technology Planning & Evaluation (IITP) grant funded by the Korea government (MSIT) (No. 2022-0-00448, Deep Total Recall); the National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIT) (No. 2022R1A4A3029480); the Korea Institute for Advancement of Technology (KIAT) grant funded by the Korea government (MOTIE) (P0017124); the Technological Innovation R&D Program (S3250054) funded by the Ministry of SMEs and Startups (MSS, Korea); and Inha University.
