Background and Local Histogram-Based Object Tracking Approach

A Background and Local Histogram-Based Object Tracking Approach for Road Situation Awareness

  • Received : 2013.05.01
  • Accepted : 2013.06.22
  • Published : 2013.06.30

Abstract

Compared with traditional video monitoring systems, whose main service is video recording, an intelligent video monitoring system can extract and track objects and detect events such as car accidents, traffic congestion, and pedestrians. Object tracking is thus an essential function for various intelligent video monitoring and surveillance systems. In this paper, we propose a background and local histogram-based object tracking approach for intelligent video monitoring systems. For robust object tracking in live situations, the results of optical flow and local histogram verification are combined with the result of background subtraction. In the proposed approach, local histogram verification allows the system to keep tracking the target reliably even when the local histogram at the LK-estimated position is not similar to the previous histogram. Experimental results show that the proposed tracking algorithm is robust to object occlusion and scale changes.
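For orientation, the following minimal Python/OpenCV sketch illustrates the two motion cues the abstract combines: a background subtractor providing a foreground mask and pyramidal Lucas-Kanade (LK) optical flow propagating the tracked object's feature points. The MOG2 subtractor, the specific LK parameters, and the helper name motion_estimate are illustrative assumptions, not the configuration used in the paper.

```python
import cv2
import numpy as np

# Hypothetical building blocks (parameter values are placeholders, not the paper's):
# MOG2 background subtraction for the foreground mask, and pyramidal Lucas-Kanade
# optical flow to propagate the object's feature points between frames.
back_sub = cv2.createBackgroundSubtractorMOG2(history=300, varThreshold=16)
lk_params = dict(winSize=(21, 21), maxLevel=3,
                 criteria=(cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 30, 0.01))

def motion_estimate(prev_gray, curr_gray, curr_bgr, prev_pts):
    """Return LK-propagated points of the object and the current foreground mask."""
    fg_mask = back_sub.apply(curr_bgr)                      # background subtraction result
    next_pts, status, _ = cv2.calcOpticalFlowPyrLK(
        prev_gray, curr_gray, prev_pts, None, **lk_params)  # LK motion estimate
    good = status.ravel() == 1                              # keep successfully tracked points
    return next_pts[good], fg_mask
```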

Various technologies are being developed to monitor and automatically recognize road situations such as vehicle collisions, traffic flow conditions, and pedestrian accidents, in order to provide traffic information and emergency rescue services. Tracking diverse objects and recognizing situations through road monitoring requires object tracking that is robust to noise and occlusion. In this paper, we propose a tracking method for outdoor environments that combines background subtraction, LK optical flow, and local histogram features to produce several tracking estimates, making the tracker comparatively robust to object changes and noise. Specifically, optical flow is applied to detect the object's initial motion, measuring displacement independently of color information and brightness changes. Based on the measured motion, local histogram verification assesses the reliability of the estimate. When the reliability is low, the background subtraction result and the local histogram tracker's result are combined to estimate a new position. Experiments show that the proposed method operates robustly under occlusions, the appearance of new features, and scale changes that can arise while an object is being tracked. A sketch of this verification-and-fallback step follows below.
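As a complement to the sketch above, the verification and fallback step might look like the following. The Bhattacharyya distance, the 0.4 threshold, and the simple box-averaging fallback are assumptions made for illustration; the abstract does not specify the exact similarity measure or combination rule.

```python
def local_histogram(hue, box):
    """Normalized hue histogram of the patch inside box = (x, y, w, h)."""
    x, y, w, h = box
    patch = hue[y:y + h, x:x + w]
    hist = cv2.calcHist([patch], [0], None, [32], [0, 180])
    return cv2.normalize(hist, hist).flatten()

def verify_position(frame_bgr, lk_box, prev_hist, bg_box, hist_box, max_dist=0.4):
    """Keep the LK-estimated box if its local histogram still matches the previous
    one (Bhattacharyya distance below max_dist); otherwise fall back to a simple
    blend of the background-subtraction box and the local histogram tracker's box
    (a stand-in for the paper's combination rule)."""
    hue = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)[:, :, 0]
    dist = cv2.compareHist(prev_hist, local_histogram(hue, lk_box),
                           cv2.HISTCMP_BHATTACHARYYA)        # 0 means identical
    if dist < max_dist:
        return lk_box
    return tuple(int(round((a + b) / 2.0)) for a, b in zip(bg_box, hist_box))
```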
