UGR Detection and Tracking in Aerial Images from UFR for Remote Control

Ground Robot Detection and Tracking through Online Learning of Aerial Images from a Flying Robot

  • Kim, Seung-Hun (Intelligent Robotics Research Center, Korea Electronics Technology Institute)
  • Jung, Il-Kyun (Intelligent Robotics Research Center, Korea Electronics Technology Institute)
  • Received : 2014.12.01
  • Accepted : 2015.04.23
  • Published : 2015.05.31

Abstract

In this paper, we propose a visual-information system that provides a tele-operator with highly maneuverable control of a ground robot. The visual information is a bird's-eye-view image of the surroundings of the UGR (Unmanned Ground Robot), captured by a UFR (Unmanned Flying Robot). For the UFR to follow the UGR at all times, a UGR detection and tracking method is required. The proposed system uses the TLD (Tracking-Learning-Detection) method to rapidly and robustly estimate the motion of the detected UGR between consecutive frames, while training an online UGR detector for the tracked target. An extended Kalman filter is additionally applied to enhance the performance of the tracker. As a result, the tele-operator is provided with visual information that makes remote control more convenient.
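The pipeline described in the abstract (online TLD tracking of the UGR in the aerial video, with a Kalman-filter stage smoothing the result) can be sketched roughly as follows. This is an illustrative sketch, not the authors' implementation: it assumes opencv-contrib-python for the legacy TLD tracker, a hypothetical input video "ufr_aerial.mp4", and a simple constant-velocity motion model, under which the paper's extended Kalman filter reduces to a standard linear Kalman filter on the bounding-box centre.

    # Illustrative sketch (not the authors' code): TLD tracking of the UGR in the
    # aerial video, with a constant-velocity Kalman filter smoothing the estimated
    # bounding-box centre. Assumes opencv-contrib-python (legacy TLD tracker) and
    # a hypothetical video file "ufr_aerial.mp4" with a manually chosen initial box.
    import cv2
    import numpy as np

    class ConstantVelocityKF:
        """Linear Kalman filter on state [x, y, vx, vy]; with this linear model
        the extended Kalman filter of the paper reduces to the standard form."""

        def __init__(self, x0, y0, dt=1.0):
            self.x = np.array([x0, y0, 0.0, 0.0], dtype=float)   # state
            self.P = np.eye(4) * 100.0                           # covariance
            self.F = np.array([[1, 0, dt, 0],                    # motion model
                               [0, 1, 0, dt],
                               [0, 0, 1, 0],
                               [0, 0, 0, 1]], dtype=float)
            self.H = np.array([[1, 0, 0, 0],
                               [0, 1, 0, 0]], dtype=float)       # we observe (x, y)
            self.Q = np.eye(4) * 0.01                            # process noise
            self.R = np.eye(2) * 4.0                             # measurement noise

        def predict(self):
            self.x = self.F @ self.x
            self.P = self.F @ self.P @ self.F.T + self.Q
            return self.x[:2]

        def update(self, z):
            y = np.asarray(z, dtype=float) - self.H @ self.x     # innovation
            S = self.H @ self.P @ self.H.T + self.R
            K = self.P @ self.H.T @ np.linalg.inv(S)             # Kalman gain
            self.x = self.x + K @ y
            self.P = (np.eye(4) - K @ self.H) @ self.P

    cap = cv2.VideoCapture("ufr_aerial.mp4")      # hypothetical aerial video
    ok, frame = cap.read()
    bbox = cv2.selectROI("init", frame, False)    # operator marks the UGR once

    tracker = cv2.legacy.TrackerTLD_create()      # online-learning tracker (TLD)
    tracker.init(frame, bbox)
    kf = ConstantVelocityKF(bbox[0] + bbox[2] / 2, bbox[1] + bbox[3] / 2)

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        kf.predict()                              # predicted UGR position
        found, bbox = tracker.update(frame)       # TLD detection / tracking
        if found:                                 # correct the filter with the measurement
            kf.update((bbox[0] + bbox[2] / 2, bbox[1] + bbox[3] / 2))
        cx, cy = kf.x[0], kf.x[1]
        cv2.circle(frame, (int(cx), int(cy)), 5, (0, 255, 0), -1)
        cv2.imshow("UGR tracking", frame)
        if cv2.waitKey(1) == 27:                  # Esc to quit
            break

The filtered centre can then be fed to the UFR's following controller, so that brief TLD detection failures do not translate into jumps in the displayed bird's-eye view.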

Keywords

References

  1. J. P. Lewis, "Fast Template Matching," Vision Interface, pp. 120-123, 1995.
  2. P. Viola and M. Jones, "Rapid object detection using a boosted cascade of simple features," in Proc. CVPR conf., vol. I, pp. 511-518, 2001.
  3. Z. Kalal, K. Mikolajczyk, and J. Matas, "Tracking-Learning-Detection," IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 34, no. 7, Jul. 2012.
  4. Z. Kalal, J. Matas, and K. Mikolajczyk, "P-N learning: Bootstrapping binary classifiers by structural constraints," in Proc. CVPR conf., pp. 49-56, Jun. 2010.
  5. http://www.msss.com/all_projects/msl-mahli.php
  6. http://www.fujitsu.com/downloads/MICRO/fma/pdf/360_OmniView_AppNote.pdf
  7. http://www.nrec.ri.cmu.edu/projects/sacr/
  8. http://www.uas-europe.se/index.php/products/skyview-ground-control-station-software
  9. J. Shi and C. Tomasi, "Good features to track," in Proc. CVPR conf., pp. 593-600, Jun. 1994.
  10. Z. Kalal, K. Mikolajczyk, and J. Matas, "Forward-Backward Error: Automatic Detection of Tracking Failures," International Conference on Pattern Recognition, pp. 23-26, 2010.
  11. V. Lepetit and P. Fua, "Keypoint recognition using randomized trees," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 28, pp. 1465-1479, Sept. 2006. https://doi.org/10.1109/TPAMI.2006.188
  12. M. Ozuysal, P. Fua, and V. Lepetit, "Fast Keypoint Recognition in Ten Lines of Code," Conference on Computer Vision and Pattern Recognition, 2007.
  13. M. Calonder, V. Lepetit, and P. Fua, "BRIEF: Binary Robust Independent Elementary Features," European Conference on Computer Vision, 2010.
  14. L. Breiman, "Random forests," Machine Learning, vol. 45, no. 1, pp. 5-32, 2001. https://doi.org/10.1023/A:1010933404324
  15. M. Brown and D. Lowe, "Recognising Panoramas," Proc. Ninth Int'l Conf. Computer Vision, pp. 1218-1227, 2003.
  16. B. G. Seo, Y. Choe, H. C. Roh, and M. J. Chung, "Graph-based Segmentation for Scene Understanding of an Autonomous Vehicle in Urban Environments," Journal of Korea Robotics Society, vol. 9, no. 1, pp. 1-10, 2014. https://doi.org/10.7746/jkros.2014.9.1.001
  17. S. Huh, S. Cho, and D. H. Shim, "3-D Indoor Navigation and Autonomous Flight of a Micro Aerial Vehicle using a Low-cost LIDAR," Journal of Korea Robotics Society, vol. 9, no. 3, pp. 154-159, 2014. https://doi.org/10.7746/jkros.2014.9.3.154

Cited by

  1. 2D-3D Pose Estimation using Multi-view Object Co-segmentation vol.12, pp.1, 2017, https://doi.org/10.7746/jkros.2017.12.1.033