Tiny Drone Tracking with a Moving Camera

A Method for Tracking a Tiny Drone in a Moving-Camera Environment

  • Sohee Son (Department of Multimedia Engineering, Graduate School of Information and Communications, Hanbat National University) ;
  • Jinwoo Jeon (Electronics and Telecommunications Research Institute) ;
  • Injae Lee (Electronics and Telecommunications Research Institute) ;
  • Jihun Cha (Electronics and Telecommunications Research Institute) ;
  • Haechul Choi (Department of Multimedia Engineering, Graduate School of Information and Communications, Hanbat National University)
  • Received : 2019.05.22
  • Accepted : 2019.07.25
  • Published : 2019.09.30

Abstract

With the rapid development of unmanned aerial vehicles (UAVs) and drones, the demand for drone surveillance systems is increasing. Since surveillance systems based on fixed cameras cover only a limited range, surveillance systems with a moving camera, applicable to PTZ (Pan-Tilt-Zoom) cameras, are required. Selecting features of the target plays a critical role in tracking, and the target has to be represented by its shape or appearance. Considering these conditions, this paper introduces an object tracking method based on optical flow to track a tiny drone with a moving camera. In addition, a tracking method that combines optical flow with a Kalman filter is proposed so that tracking can continue even after a tracking failure. Experiments are conducted on sequences whose target sizes range from a minimum of 12 pixels to a maximum of 56,337 pixels, and the proposed method improves precision by an average of 175%. The experimental results also show that the proposed method can track a target as small as 12 pixels.

As the use of unmanned aerial vehicles has grown recently, the use of small drones has also increased sharply. With this market growth, the possibility of drone misuse has risen, raising the need for surveillance systems that can properly control drones. Because surveillance systems based on fixed cameras have limited coverage, object tracking research in a moving-camera environment, applicable to PTZ (Pan-Tilt-Zoom) cameras, is needed, along with object tracking optimized for real-time operation. For effective tracking, the features of the target object must be defined to suit the background environment, or the feature information of the object must be extracted effectively. This paper introduces an object tracking method based on optical flow for tracking a small drone, together with a method that combines optical flow and a Kalman filter to re-track the target after a tracking failure. For comparison of tracking results, experiments are presented on target sizes ranging from a minimum of 12 pixels to a maximum of 56,337 pixels. Compared with existing tracking methods, the proposed method improved precision by an average of 175% and detection rate by an average of 143%, and it also tracked targets as small as 12 pixels.
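As a rough illustration of the approach described in the abstract, the following Python sketch (not the authors' implementation) tracks feature points with pyramidal Lucas-Kanade optical flow and falls back to a constant-velocity Kalman filter prediction when the flow tracker loses the target. The OpenCV/NumPy usage, the hypothetical `init_box` bounding box, the search-window size, and the noise covariances are assumptions for illustration, not details taken from the paper.

```python
# Minimal sketch: optical-flow tracking with a Kalman-filter fallback on tracking failure.
# Assumes OpenCV (cv2) and NumPy; `video_path` and `init_box = (x, y, w, h)` are hypothetical inputs.
import cv2
import numpy as np

def make_kalman():
    """Constant-velocity Kalman filter over state [x, y, vx, vy], measuring [x, y]."""
    kf = cv2.KalmanFilter(4, 2)
    kf.transitionMatrix = np.array([[1, 0, 1, 0],
                                    [0, 1, 0, 1],
                                    [0, 0, 1, 0],
                                    [0, 0, 0, 1]], dtype=np.float32)
    kf.measurementMatrix = np.array([[1, 0, 0, 0],
                                     [0, 1, 0, 0]], dtype=np.float32)
    kf.processNoiseCov = np.eye(4, dtype=np.float32) * 1e-2   # assumed tuning values
    kf.measurementNoiseCov = np.eye(2, dtype=np.float32) * 1e-1
    return kf

def track(video_path, init_box):
    cap = cv2.VideoCapture(video_path)
    ok, frame = cap.read()
    if not ok:
        return
    prev_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # Detect Shi-Tomasi corners inside the initial drone box.
    x, y, w, h = init_box
    mask = np.zeros_like(prev_gray)
    mask[y:y + h, x:x + w] = 255
    pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=50,
                                  qualityLevel=0.01, minDistance=3, mask=mask)

    kf = make_kalman()
    kf.statePost = np.array([[x + w / 2], [y + h / 2], [0], [0]], dtype=np.float32)

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        prediction = kf.predict()[:2].ravel()   # Kalman prediction of the target center

        center = None
        if pts is not None and len(pts) > 0:
            # Pyramidal Lucas-Kanade optical flow from the previous frame.
            nxt, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, pts, None,
                                                      winSize=(21, 21), maxLevel=3)
            good = nxt[status.ravel() == 1]
            if len(good) > 0:
                center = good.reshape(-1, 2).mean(axis=0)          # tracked center
                kf.correct(center.astype(np.float32).reshape(2, 1))
                pts = good.reshape(-1, 1, 2)
            else:
                pts = None                                         # tracking failure

        if center is None:
            # Tracking failed: follow the Kalman prediction and try to re-detect
            # features in a small window around the predicted position.
            cx, cy = int(prediction[0]), int(prediction[1])
            center = prediction
            mask = np.zeros_like(gray)
            mask[max(cy - 20, 0):cy + 20, max(cx - 20, 0):cx + 20] = 255
            pts = cv2.goodFeaturesToTrack(gray, maxCorners=50,
                                          qualityLevel=0.01, minDistance=3, mask=mask)

        prev_gray = gray
        yield center   # e.g. for c in track("drone.mp4", (100, 100, 30, 30)): ...
```

The fallback keeps the track alive through short failures and seeds re-detection near the predicted position, which mirrors the re-tracking-after-failure idea described in the abstract.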
