Implementation of a sensor fusion system for autonomous guided robot navigation in outdoor environments

  • Lee, Seung-Hwan (Department of Electrical and Computer Engineering, Seoul National University) ;
  • Lee, Heon-Cheol (Department of Electrical and Computer Engineering, Seoul National University) ;
  • Lee, Beom-Hee (Department of Electrical and Computer Engineering, Seoul National University)
  • Received : 2009.12.04
  • Accepted : 2010.05.18
  • Published : 2010.05.31

Abstract

Autonomous guided robot navigation, which consists of following unknown paths and avoiding unknown obstacles, is a fundamental technique for unmanned robots in outdoor environments. Following an unknown path requires techniques such as path recognition, path planning, and robot pose estimation. In this paper, we propose a novel sensor fusion system for autonomous guided robot navigation in outdoor environments. The proposed system consists of three monocular cameras and an array of nine infrared range sensors. The two cameras mounted on the robot's left and right sides are used to recognize unknown paths and to estimate the robot's pose relative to those paths through a Bayesian sensor fusion method, while the camera mounted at the front of the robot is used to recognize abrupt curves and unknown obstacles. The infrared range sensor array improves the robustness of obstacle avoidance. The forward camera and the infrared range sensor array are fused through a rule-based method for obstacle avoidance. Experiments in outdoor environments show that a mobile robot equipped with the proposed sensor fusion system successfully performed real-time autonomous guided navigation.
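
The abstract names two fusion stages (Bayesian fusion of the side cameras for path-relative pose, and rule-based fusion of the front camera with the infrared array for obstacle avoidance) but, being an abstract, gives no equations. The Python sketch below is an illustration only of one common way such stages can be realized; the function names, the Gaussian measurement model, and the 0.5 m threshold are assumptions, not the paper's actual implementation.

```python
# Illustrative sketch, not the paper's implementation: one common realization
# of the two fusion stages named in the abstract. All names, the Gaussian
# measurement model, and the 0.5 m threshold are assumptions.

def fuse_lateral_offset(mu_left, var_left, mu_right, var_right):
    """Bayesian fusion of the left/right side cameras' estimates of the
    robot's lateral offset on the path: treat each estimate as an
    independent Gaussian and combine them by inverse-variance weighting
    (the product-of-Gaussians update)."""
    fused_var = 1.0 / (1.0 / var_left + 1.0 / var_right)
    fused_mu = fused_var * (mu_left / var_left + mu_right / var_right)
    return fused_mu, fused_var

def obstacle_ahead(front_camera_detects, ir_ranges_m, ir_threshold_m=0.5):
    """Rule-based fusion of the front camera and the nine-element infrared
    range array: declare an obstacle if the camera flags one OR any IR
    sensor reads closer than the (hypothetical) threshold."""
    return front_camera_detects or any(r < ir_threshold_m for r in ir_ranges_m)

# Example with made-up measurements (meters); nine IR readings, one per sensor.
mu, var = fuse_lateral_offset(0.12, 0.04, 0.18, 0.09)
stop = obstacle_ahead(False, [1.2, 0.9, 0.8, 0.4, 1.5, 2.0, 1.1, 0.7, 1.3])
print(f"fused offset = {mu:.3f} m (variance {var:.4f}); obstacle ahead: {stop}")
```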
