Bayesian Sensor Fusion of Monocular Vision and Laser Structured Light Sensor for Robust Localization of a Mobile Robot

이동 로봇의 강인 위치 추정을 위한 단안 비젼 센서와 레이저 구조광 센서의 베이시안 센서융합

  • 김민영 (School of Electronics Engineering, Kyungpook National University) ;
  • 안상태 (Agency for Defense Development) ;
  • 조형석 (Department of Mechanical Engineering, KAIST)
  • Received : 2009.06.09
  • Accepted : 2010.01.10
  • Published : 2010.04.01

Abstract

This paper describes a map-based localization procedure for mobile robots that uses a sensor fusion technique in structured environments. Combining sensors with different characteristics and limited sensing capabilities is advantageous because the sensors complement and cooperate with one another to obtain better information about the environment. In this paper, for robust self-localization of a mobile robot equipped with a monocular camera and a laser structured light sensor, the environment information acquired from the two sensors is combined and fused by a Bayesian sensor fusion technique based on a probabilistic reliability function for each sensor, predefined through experiments. For self-localization with the monocular camera, the robot extracts image features consisting of vertical edge lines from the input camera images and uses them as natural landmark points. With the laser structured light sensor, it instead uses geometric features composed of corners and planes as natural landmark shapes, extracted from range data taken at a constant height above the navigation floor. Although each feature group alone is sometimes sufficient to localize the robot, the features from both sensors are used and fused simultaneously so that localization remains reliable under various environmental conditions. To verify the advantage of multi-sensor fusion, a series of experiments is performed, and the experimental results are discussed in detail.
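The fusion step described in the abstract can be sketched in outline. The following is a minimal illustration only, not the paper's implementation: the paper's per-sensor reliability functions were determined experimentally, whereas here hypothetical Gaussian likelihoods with assumed standard deviations stand in for them, and the continuous pose space is reduced to a few discrete candidate poses.

```python
import math

def gaussian_likelihood(measured, predicted, sigma):
    """Likelihood of a sensor measurement given a candidate pose prediction.

    A Gaussian stands in for the sensor's reliability function; sigma is a
    hypothetical per-sensor uncertainty, not a value from the paper.
    """
    return math.exp(-0.5 * ((measured - predicted) / sigma) ** 2)

def fuse_posterior(prior, vision_lik, laser_lik):
    """Bayesian fusion over discrete candidate poses.

    Posterior is proportional to prior times the product of the two
    sensors' likelihoods; the result is normalized to sum to one.
    """
    unnorm = [p * v * l for p, v, l in zip(prior, vision_lik, laser_lik)]
    total = sum(unnorm)
    return [u / total for u in unnorm]

if __name__ == "__main__":
    # Three candidate poses along one axis (illustrative values).
    poses = [0.0, 1.0, 2.0]
    # Vision is assumed less precise (sigma 0.5) than the laser (sigma 0.2).
    vision = [gaussian_likelihood(1.1, x, sigma=0.5) for x in poses]
    laser = [gaussian_likelihood(0.9, x, sigma=0.2) for x in poses]
    posterior = fuse_posterior([1 / 3] * 3, vision, laser)
    print(posterior)  # mass concentrates on the pose near both measurements
```

Because the likelihoods multiply, a pose must be consistent with both sensors to retain posterior mass, which is what makes the fused estimate more robust than either sensor alone.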

References

  1. H. J. Sohn and B. K. Kim, "A robust localization algorithm for mobile robots with laser range finders," Proc. of 2005 IEEE ICRA, pp. 3545-3550, 2005.
  2. D. Brscic and H. Hashimoto, "Model based robot localization using onboard and distributed laser range finders," Proc. of 2008 IEEE/RSJ IROS, pp. 1154-1159, 2008.
  3. D. M. Cole and P. M. Newman, "Using laser range data for 3D SLAM in outdoor environments," Proc. of 2006 IEEE ICRA, pp. 1556-1563, 2006.
  4. M. Jung, H. Myung, H. Lee, and S. Bang, "Ambiguity resolving in structured light 2D range finder for SLAM operation for home robot applications," Proc. of 2005 IEEE Workshop on Advanced Robotics and its Social Impacts, pp. 18-23, 2005.
  5. 김민근, "Navigation of a mobile robot based on sequential autonomous map building and self-localization using a laser vision sensor," M.S. thesis, Department of Mechanical Engineering, KAIST, 2003.
  6. M. H. Kim, S. C. Lee, and K. H. Lee, "Self-localization of mobile robot with single camera in corridor environment," Proc. of 2001 IEEE International Symposium on Industrial Electronics, vol. 3, pp. 12-16, 2001.
  7. M. Zaman, "High precision relative localization using a single camera," Proc. of 2007 IEEE ICRA, pp. 3908-3914, 2007.
  8. A. J. Davison, I. D. Reid, N. D. Molton, and O. Stasse, "MonoSLAM: real-time single camera SLAM," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 29, no. 6, pp. 1052-1067, 2007. https://doi.org/10.1109/TPAMI.2007.1049
  9. O. Wijk and H. I. Christensen, "Localization and navigation of a mobile robot using natural point landmarks extracted from sonar data," Robotics and Autonomous Systems, vol. 31, pp. 31-42, 2000. https://doi.org/10.1016/S0921-8890(99)00085-8
  10. H. Kim, J. Choi, and M. Park, "Indoor localization system using multi-modulation of ultrasonic sensors and digital compass," Proc. of 2008 IEEE/RSJ IROS, pp. 1359-1364, 2008.
  11. A. Burguera, G. Oliver, and J. D. Tardos, "Robust scan matching localization using ultrasonic range finders," Proc. of 2005 IEEE/RSJ IROS, pp. 1367-1372, 2005.
  12. K. O. Arras and N. Tomatis, "Improving robustness and precision in mobile robot localization by using laser range finding and monocular vision," Proc. of 3rd European Workshop on Advanced Mobile Robots, pp. 177-185, 6-8, September 1999.
  13. 채수용, "Improving the localization accuracy of an indoor mobile robot using sensor fusion," M.S. thesis, Department of Electrical Engineering, KAIST, 2000.
  14. A. Diosi and L. Kleeman, "Advanced sonar and laser range finder fusion for simultaneous localization and mapping," Proc. of 2004 IEEE/RSJ IROS, pp. 1854-1859, 2004.
  15. K. O. Arras, N. Tomatis, B. T. Jensen, and R. Siegwart, "Multisensor on-the-fly localization: Precision and reliability for applications," Robotics and Autonomous Systems, vol. 34, no. 2-3, pp. 131-143, 2001. https://doi.org/10.1016/S0921-8890(00)00117-2
  16. K. Sugihara, "Some location problems for robot navigation using a single camera," Computer Vision, Graphics, and Image Processing, vol. 42, pp. 112-129, 1988. https://doi.org/10.1016/0734-189X(88)90145-4
  17. L. A. Klein, Sensor and Data Fusion: A Tool for Information Assessment and Decision Making, vol. PM138, SPIE Publications, 2004.
  18. J. K. Hackett and M. Shah, "Multi-sensor fusion: a perspective," Proc. of IEEE Int. Conf. Robotics and Automation, pp. 1324-1330, 1990.

Cited by

  1. A Taxonomy of Vision Systems for Ground Mobile Robots, vol. 11, no. 7, 2014, https://doi.org/10.5772/58900
  2. Advances in sensing and processing methods for three-dimensional robot vision, vol. 15, no. 2, 2018, https://doi.org/10.1177/1729881418760623