
A Low-Cost Lidar Sensor based Glass Feature Extraction Method for an Accurate Map Representation using Statistical Moments


  • An, Ye Chan (School of Electronic Engineering, Kumoh National Institute of Technology);
  • Lee, Seung Hwan (School of Electronic Engineering, Kumoh National Institute of Technology)
  • Received : 2020.12.15
  • Accepted : 2021.02.18
  • Published : 2021.05.31

Abstract

This study addresses a low-cost lidar sensor-based glass feature extraction method for an accurate map representation using statistical moments, i.e., the mean and variance. Since a low-cost lidar sensor produces range-only data without intensity or multi-echo measurements, glass-like objects are difficult to detect. In this study, glass detection exploits the principle that a ray emitted from the lidar is reflected back from a glass surface only when its incidence angle with respect to the surface is close to zero degrees. All sensor data are preprocessed and clustered, and each cluster is represented by its statistical moments as a glass feature candidate. Glass features are then selected from the candidates according to several conditions derived from this principle and the geometric relations in the global coordinate system. The accumulated glass features are classified according to distance and finally represented on the map. Several experiments were conducted in glass environments, and the results showed that the proposed method accurately extracted and represented glass windows when proper parameters were used. The parameters were empirically designed and carefully analyzed. In future work, we will combine the proposed glass feature extraction method with conventional SLAM algorithms and evaluate them in glass environments.


Acknowledgement

This research was supported by Kumoh National Institute of Technology (2018-104-005).

References

  1. C. Cadena, L. Carlone, H. Carrillo, Y. Latif, D. Scaramuzza, J. Neira, I. Reid, and J. J. Leonard, "Past, present, and future of simultaneous localization and mapping: Toward the robust-perception age," IEEE Transactions on Robotics, vol. 32, no. 6, pp. 1309-1332, Dec., 2016, DOI: 10.1109/tro.2016.2624754.
  2. H. Wei, X. Li, Y. Shi, B. You, and Y. Xu, "Multi-sensor Fusion Glass Detection for Robot Navigation and Mapping," 2018 WRC Symposium on Advanced Robotics and Automation (WRC SARA), Beijing, China, pp. 184-188, 2018, DOI: 10.1109/wrc-sara.2018.8584213.
  3. J. Kim, K. K. Kwon, and S. I. Lee, "Trends and Applications on Lidar Sensor Technology," Electronics and Telecommunications Trends, vol. 27, no. 6, pp. 134-143, 2012.
  4. K. Maatta, J. Kostamovaara, and R. Myllyla, "Profiling of hot surfaces by pulsed time-of-flight laser range finder techniques," Applied Optics, vol. 32, no. 27, pp. 5334-5347, 1993, DOI: 10.1364/ao.32.005334.
  5. H. Yoon, H. Song, and K. Park, "A phase-shift laser scanner based on a time-counting method for high linearity performance," Review of Scientific Instruments, vol. 82, no. 7, pp. 1-4, 2011, DOI: 10.1063/1.3600456.
  6. SLAMTEC, Low-cost RPlidar sensor models, [Online], https://www.slamtec.com/en, Accessed: December 12, 2020.
  7. P. Foster, Z. Sun, J. J. Park, and B. Kuipers, "VisAGGE: Visible Angle Grid for Glass Environments," 2013 IEEE International Conference on Robotics and Automation (ICRA), Karlsruhe, Germany, pp. 2213-2220, 2013, DOI: 10.1109/icra.2013.6630875.
  8. Y. Shih, D. Krishnan, F. Durand, and W. T. Freeman, "Reflection removal using ghosting cues," 2015 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Boston, MA, USA, pp. 3193-3201, 2015, DOI: 10.1109/cvpr.2015.7298939.
  9. C. Reymann and S. Lacroix, "Improving LiDAR Point Cloud Classification using Intensities and Multiple Echoes," 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Hamburg, Germany, pp. 5122-5128, 2015, DOI: 10.1109/iros.2015.7354098.
  10. J. Yun and J. Y. Sim, "Reflection Removal for Large-Scale 3D Point Clouds," 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition, Salt Lake City, UT, USA, pp. 4597-4605, 2018, DOI: 10.1109/cvpr.2018.00483.
  11. A. Diosi and L. Kleeman, "Advanced sonar and laser range finder fusion for simultaneous localization and mapping," 2004 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Sendai, Japan, pp. 1854-1859, 2004, DOI: 10.1109/iros.2004.1389667.
  12. Wikipedia, The definition of sensor fusion, [Online], https://en.wikipedia.org/wiki/Sensor_fusion, Accessed: December 12, 2020.
  13. S.-W. Yang and C.-C. Wang, "Dealing with Laser Scanner Failure: Mirrors and Windows," 2008 IEEE International Conference on Robotics and Automation, Pasadena, CA, USA, pp. 3009-3015, 2008, DOI: 10.1109/robot.2008.4543667.
  14. S.-W. Yang and C.-C. Wang, "On solving mirror reflection in lidar sensing," IEEE/ASME Transactions on Mechatronics, vol. 16, no. 2, pp. 255-265, Apr., 2011, DOI: 10.1109/tmech.2010.2040113.
  15. Z. Huang, K. Wang, K. Yang, R. Cheng, and J. Bai, "Glass detection and recognition based on the fusion of ultrasonic sensor and RGB-D sensor for the visually impaired," Target and Background Signatures, Berlin, Germany, vol. 10794, 2018, DOI: 10.1117/12.2325496.
  16. X. Wang and J. G. Wang, "Detecting glass in Simultaneous Localisation and Mapping," Robotics and Autonomous Systems, vol. 88, pp. 97-103, Feb., 2017, DOI: 10.1016/j.robot.2016.11.003.
  17. X. Zhao, Z. Yang, and S. Schwertfeger, "Mapping with Reflection - Detection and Utilization of Reflection in 3D Lidar Scans," IEEE International Symposium on Safety, Security, and Rescue Robotics (SSRR), Abu Dhabi, UAE, 2020, DOI: 10.1109/SSRR50563.2020.9292595.
  18. R. Koch, S. May, and A. Nuchter, "Effective Distinction Of Transparent And Specular Reflective Objects In Point Clouds Of A Multi-Echo Laser Scanner," 2017 18th International Conference on Advanced Robotics (ICAR), Hong Kong, China, pp. 566-571, 2017, DOI: 10.1109/icar.2017.8023667.
  19. R. Koch, S. May, P. Murmann, and A. Nuchter, "Identification of transparent and specular reflective material in laser scans to discriminate affected measurements for faultless robotic SLAM," Robotics and Autonomous Systems, vol. 87, pp. 296-312, Jan., 2017, DOI: 10.1016/j.robot.2016.10.014.
  20. R. Koch, S. May, P. Koch, M. Kuhn, and A. Nuchter, "Detection of specular reflections in range measurements for faultless robotic SLAM," Robot 2015: Second Iberian Robotics Conference, Lisbon, Portugal, pp. 133-145, 2016, DOI: 10.1007/978-3-319-27146-0_11.
  21. J. Kim and W. Chung, "Robust Localization of Mobile Robots Considering Reliability of LiDAR Measurements," 2018 IEEE International Conference on Robotics and Automation (ICRA), Brisbane, QLD, Australia, pp. 6491-6496, 2018, DOI: 10.1109/icra.2018.8460648.
  22. J. Gong, Y. Duan, Y. Man, and G. Xiong, "VPH+: An Enhanced Vector Polar Histogram Method for Mobile Robot Obstacle Avoidance," 2007 IEEE International Conference on Mechatronics and Automation, Harbin, China, pp. 2784-2788, 2007, DOI: 10.1109/icma.2007.4304000.
  23. P. J. Besl and N. D. McKay, "Method for Registration of 3-D Shapes," Sensor Fusion IV: Control Paradigms and Data Structures, vol. 1611, 1992, DOI: 10.1117/12.57955.
  24. S. H. Lee, H. J. Kim, and B. H. Lee, "An Efficient Rescue System with Online Multi-Agent SLAM Framework," Sensors, vol. 20, no. 1, pp. 1-17, 2020, DOI: 10.3390/s20010235.