Survey on Visual Navigation Technology for Unmanned Systems

  • Kim, Hyoun-Jin (Department of Mechanical and Aerospace Engineering, Seoul National University) ;
  • Seo, Hoseong (Department of Mechanical and Aerospace Engineering, Seoul National University) ;
  • Kim, Pyojin (Department of Mechanical and Aerospace Engineering, Seoul National University) ;
  • Lee, Chung-Keun (Department of Mechanical and Aerospace Engineering, Seoul National University)
  • Received : 2015.04.14
  • Accepted : 2015.04.23
  • Published : 2015.04.30

Abstract

This paper surveys vision-based autonomous navigation technologies for unmanned systems. The main branches of visual navigation are visual servoing, visual odometry, and visual simultaneous localization and mapping (SLAM). Visual servoing computes a velocity input from the feature difference between the desired image and the currently acquired image, and uses it to guide a mobile system to a desired pose. Visual odometry estimates the relative pose between consecutive image frames, which improves accuracy over existing dead-reckoning methods. Visual SLAM builds a map of an unknown environment while simultaneously localizing the mobile system within it, a capability essential for operating unmanned ground vehicles and aircraft in environments that are not known in advance. Trends in visual navigation are identified by examining international research on each of these technologies.
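
To make the feature-error-to-velocity mapping concrete, the sketch below implements the classical image-based visual servoing law v = -λ L⁺ (s − s*) for point features, as popularized in Chaumette and Hutchinson's tutorials. It is a minimal illustration, not code from the paper; all function names, the gain, and the numbers are assumptions made for the example.

```python
import numpy as np

def interaction_matrix(x, y, Z):
    """Interaction (image Jacobian) matrix of one normalized image point.

    Relates the 6-DOF camera velocity (vx, vy, vz, wx, wy, wz) to the
    image-plane velocity of the point; Z is the point's depth estimate.
    """
    return np.array([
        [-1.0 / Z, 0.0,       x / Z, x * y,     -(1 + x**2),  y],
        [0.0,      -1.0 / Z,  y / Z, 1 + y**2,  -x * y,      -x],
    ])

def ibvs_velocity(s, s_star, depths, lam=0.5):
    """Classical IBVS control law: v = -lambda * L^+ * (s - s*).

    s, s_star : (N, 2) arrays of current / desired normalized features.
    depths    : (N,) depth estimates for the current points.
    Returns the 6-DOF camera velocity command (linear, angular).
    """
    L = np.vstack([interaction_matrix(x, y, Z)
                   for (x, y), Z in zip(s, depths)])
    error = (s - s_star).reshape(-1)          # stacked feature error
    return -lam * np.linalg.pinv(L) @ error   # Moore-Penrose pseudo-inverse

# Example: four point features slightly offset from their desired locations.
s      = np.array([[0.12, 0.08], [-0.10, 0.09], [-0.11, -0.10], [0.10, -0.12]])
s_star = np.array([[0.10, 0.10], [-0.10, 0.10], [-0.10, -0.10], [0.10, -0.10]])
v = ibvs_velocity(s, s_star, depths=np.ones(4))
print(v)  # velocity command that drives the feature error toward zero
```

Since the true depths Z are unavailable in practice, they are commonly replaced by estimates, for instance the depths at the desired pose.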

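The frame-to-frame estimation at the heart of visual odometry can be sketched with standard epipolar-geometry tools. The snippet below is a minimal monocular example, not the method of any surveyed paper: it uses OpenCV to recover the relative rotation and up-to-scale translation between matched features of consecutive frames, then chains the increments into a trajectory. Names outside the OpenCV API are illustrative.

```python
import numpy as np
import cv2

def relative_pose(pts_prev, pts_curr, K):
    """Relative camera motion between two frames from matched pixels.

    With a monocular camera the translation is recovered only up to an
    unknown scale factor.
    """
    E, inliers = cv2.findEssentialMat(pts_prev, pts_curr, K,
                                      method=cv2.RANSAC, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, pts_prev, pts_curr, K, mask=inliers)
    return R, t  # maps points from the previous frame into the current one

def chain(poses, R, t):
    """Append T_k = T_{k-1} * inv(dT): the camera's motion is the inverse
    of the point transform returned by recoverPose."""
    dT = np.eye(4)
    dT[:3, :3], dT[:3, 3] = R, t.ravel()
    poses.append(poses[-1] @ np.linalg.inv(dT))

poses = [np.eye(4)]  # world frame anchored at the first camera pose
# For each new frame: match features against the previous frame, then
#   R, t = relative_pose(pts_prev, pts_curr, K); chain(poses, R, t)
```

Because each increment carries some error, the chained estimate drifts over long trajectories; this drift is a key motivation for the SLAM techniques that maintain a persistent map.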

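The "simultaneous" in SLAM means the robot pose and the map share one jointly estimated state. The toy below is a one-dimensional Kalman-filter example assumed purely for illustration, not from the paper: a single relative-range measurement updates both the robot position and the landmark position through their cross-covariance.

```python
import numpy as np

# Joint SLAM state: [robot position, landmark position] on a 1-D track.
x = np.array([0.0, 0.0])          # initial estimates
P = np.diag([0.0, 1e6])           # robot known exactly, landmark unknown

F = np.eye(2)                     # landmark is static; robot adds control u
Q = np.diag([0.1**2, 0.0])        # motion noise affects the robot entry only
H = np.array([[-1.0, 1.0]])       # measurement model: z = landmark - robot
R = np.array([[0.2**2]])          # range-sensor noise

def predict(x, P, u):
    """Propagate the joint state: the robot moves by u, the map stays."""
    x = x + np.array([u, 0.0])
    P = F @ P @ F.T + Q
    return x, P

def update(x, P, z):
    """Fuse one relative-range measurement; it corrects robot AND map."""
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)            # Kalman gain
    x = x + (K @ (z - H @ x)).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P

# Robot drives toward a landmark at 5.0, observing it after every step.
true_robot, true_lm = 0.0, 5.0
rng = np.random.default_rng(0)
for _ in range(10):
    u = 0.5
    true_robot += u + rng.normal(0, 0.1)
    z = true_lm - true_robot + rng.normal(0, 0.2)
    x, P = predict(x, P, u)
    x, P = update(x, P, np.array([z]))
print(x)  # jointly estimated [robot, landmark] positions
```

Real visual SLAM replaces the scalar range with image features and the linear model with a camera projection, but the joint state over pose and map is the same idea.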