Motion Field Estimation Using U-Disparity Map in Vehicle Environment

  • Seo, Seung-Woo (Dept. of Electronic Engineering, Kwangwoon University) ;
  • Lee, Gyu-Cheol (Dept. of Electronic Engineering, Kwangwoon University) ;
  • Yoo, Ji-Sang (Dept. of Electronic Engineering, Kwangwoon University)
  • Received : 2016.01.27
  • Accepted : 2016.10.02
  • Published : 2017.01.02

Abstract

In this paper, we propose a novel motion field estimation algorithm that applies a U-disparity map and forward-backward error removal in a vehicular environment. In general, motion appears in the images captured by a vehicle-mounted camera because of the vehicle's movement; however, the extracted motion vectors are often inaccurate due to environmental factors such as illumination changes and vehicle shaking. It is particularly difficult to extract accurate motion vectors on the road surface, where adjacent pixel values are very similar. The proposed algorithm therefore first removes the road surface region from the captured image by using a U-disparity map and then computes the optical flow, which represents the motion vectors of objects, only in the remaining part of the image. A forward-backward error removal technique is also applied to improve motion vector accuracy, and the vehicle's movement is predicted by applying RANSAC (RANdom SAmple Consensus) to the obtained motion vectors, resulting in the generation of a motion field. Experimental results show that the proposed algorithm outperforms an existing algorithm.
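
The pipeline summarized above (road removal with a U-disparity map, optical flow with forward-backward error filtering, and RANSAC-based motion estimation) can be sketched in a few lines of Python with OpenCV and NumPy. The code below is only an illustrative outline, not the authors' implementation: the bin-count threshold, the corner-detector settings, the forward-backward error threshold, and the choice of a partial affine transform as the global motion model are all assumptions made for this example.

    import cv2
    import numpy as np

    def road_mask_from_u_disparity(disparity, obstacle_thresh=20):
        """Build a U-disparity map (a per-column histogram of disparity values)
        and keep only pixels that fall into densely populated bins, which
        correspond to vertical obstacles; sparsely populated bins are treated
        as road surface and masked out."""
        disp = np.clip(disparity, 0, None).astype(np.int32)
        _, w = disp.shape
        d_max = int(disp.max()) + 1
        u_disp = np.zeros((d_max, w), dtype=np.int32)  # u_disp[d, u]
        for u in range(w):
            u_disp[:, u] = np.bincount(disp[:, u], minlength=d_max)
        keep = u_disp[disp, np.arange(w)[None, :]] >= obstacle_thresh
        return keep.astype(np.uint8) * 255

    def filtered_flow(prev_gray, next_gray, mask, fb_thresh=1.0):
        """Track corners with pyramidal Lucas-Kanade and discard tracks whose
        forward-backward error exceeds fb_thresh pixels."""
        pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=500,
                                      qualityLevel=0.01, minDistance=7, mask=mask)
        if pts is None:
            return None, None
        fwd, st_f, _ = cv2.calcOpticalFlowPyrLK(prev_gray, next_gray, pts, None)
        bwd, st_b, _ = cv2.calcOpticalFlowPyrLK(next_gray, prev_gray, fwd, None)
        fb_err = np.linalg.norm(pts - bwd, axis=2).ravel()
        good = (st_f.ravel() == 1) & (st_b.ravel() == 1) & (fb_err < fb_thresh)
        return pts[good], fwd[good]

    def estimate_global_motion(prev_gray, next_gray, disparity):
        """Mask the road region, compute filtered optical flow, and fit a global
        motion model with RANSAC as a stand-in for the vehicle's ego-motion."""
        mask = road_mask_from_u_disparity(disparity)
        p0, p1 = filtered_flow(prev_gray, next_gray, mask)
        if p0 is None or len(p0) < 3:
            return None
        model, inliers = cv2.estimateAffinePartial2D(
            p0, p1, method=cv2.RANSAC, ransacReprojThreshold=3.0)
        return model  # 2x3 matrix describing the dominant image motion

Given consecutive grayscale frames and a disparity map from the stereo pair, the returned transform can be applied to the surviving feature points to draw the resulting motion field.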

References

  1. K. Yamaguchi, T. Kato and Y. Ninomiya, "Vehicle ego-motion estimation and moving object detection using a monocular camera", in Proc. Int. Conf. Pattern Recognition, Hong Kong, pp. 610-613, 2006
  2. O. Pink, F. Moosmann and A. Bachmann, "Visual features for vehicle localization and ego-motion estimation", in Proc. IEEE Intell. Veh. Symp., Xi'an, pp. 254-260, 2009
  3. G. Ligorio and A. M. Sabatini, "Extended Kalman filter-based methods for pose estimation using visual, inertial and magnetic sensors: comparative analysis and performance evaluation", Sensors, vol. 13, no. 2, pp. 1919-1941, Feb. 2013 https://doi.org/10.3390/s130201919
  4. F. J. Sharifi and M. Marey, "A Kalman-filter-based method for pose estimation in visual servoing", IEEE Trans. Robotics, vol. 26, no. 5, pp. 939-947, Oct. 2010 https://doi.org/10.1109/TRO.2010.2061290
  5. M. A. Fischler and R. C. Bolles, "Random sample consensus: a paradigm for model fitting with applications to image analysis and automated cartography", Commun. ACM, vol. 24, no. 6, pp. 381-395, Jun. 1981 https://doi.org/10.1145/358669.358692
  6. V. Lippiello, B. Siciliano and L. Villani, "Adaptive extended Kalman filtering for visual motion estimation of 3D objects", Control Eng. Practice, vol. 15, no. 1, pp. 123-134, Jan. 2007 https://doi.org/10.1016/j.conengprac.2006.05.006
  7. W. Jang, C. Lee and Y. Ho, "Efficient depth map generation for various stereo camera arrangements", J. KICS, vol. 37, no. 6, pp. 458-463, Jun. 2012 https://doi.org/10.7840/KICS.2012.37.6A.458
  8. E. Baek and Y. Ho, "Stereo image composition using poisson object editing", J. KICS, vol. 39, no. 8, pp. 453-458, Aug. 2014
  9. Z. Hu and K. Uchimura, "U-V-disparity: an efficient algorithm for stereo vision based scene analysis", in Proc. IEEE Intell. Veh. Symp., Las Vegas, pp. 48-54, 2005
  10. B. D. Lucas and T. Kanade, "An iterative image registration technique with an application to stereo vision", in Proc. Int. Joint Conf. Artificial Intell., Vancouver, pp. 674-679, 1981
  11. C. Song and J. Lee, "Detection of illegal U-turn vehicles by optical flow analysis", J. KICS, vol. 39, no. 10, pp. 948-956, Oct. 2014
  12. Z. Kalal, K. Mikolajczyk and J. Matas, "Forward-backward error: automatic detection of tracking failures", in Proc. Int. Conf. Pattern Recognition, Istanbul, pp. 23-26, 2010
  13. C. Harris and M. Stephens, "A combined corner and edge detector", in Proc. Alvey Vision Conf., Manchester, pp. 147-151, 1988
  14. B. K. P. Horn and B. Schunck, "Determining optical flow", Artificial Intell., vol. 17, no. 1-3, pp. 185-203, Aug. 1981 https://doi.org/10.1016/0004-3702(81)90024-2
  15. H. C. Longuet-Higgins and K. Prazdny, "The interpretation of a moving retinal image", Proc. Royal Soc. London B, vol. 208, no. 1173, pp. 385-397, Jul. 1980 https://doi.org/10.1098/rspb.1980.0057
  16. C. Keller, M. Enzweiler and D. M. Gavrila, "A new benchmark for stereo-based pedestrian detection", in Proc. IEEE Intell. Veh. Symp., Baden-Baden, 2011
  17. S. Seo, G. Lee and S. Lee, "Motion field estimation using u-disparity map and forward-backward error removal in vehicle environment", J. KICS, vol. 40, no. 12, pp. 2343-2352, Dec. 2015 https://doi.org/10.7840/kics.2015.40.12.2343

Cited by

  1. Moving Object Detection Using an Object Motion Reflection Model of Motion Vectors, Symmetry, vol. 11, no. 1, 2019, https://doi.org/10.3390/sym11010034