Robust Multithreaded Object Tracker through Occlusions for Spatial Augmented Reality

  • Lee, Ahyun (Hyper-connected Communication Research Laboratory, ETRI)
  • Jang, Insung (Hyper-connected Communication Research Laboratory, ETRI)
  • Received : 2017.08.07
  • Accepted : 2017.11.23
  • Published : 2018.04.01

Abstract

A spatial augmented reality (SAR) system projects a virtual image onto the surface of a real-world object and allows the user to intuitively control that image through a tangible interface. However, occlusions occur frequently, for example, when the lighting environment changes suddenly or an obstacle enters the scene. We propose a robust object tracker based on a multithreaded system that can track an object through such occlusions. Our multithreaded tracker is divided into two threads: the detection thread detects distinctive features in a frame-to-frame manner, and the tracking thread tracks features periodically using an optical-flow-based tracking method. Consequently, although the detection thread is considerably slower, the multithreaded configuration allows real-time performance. Moreover, the proposed outlier filtering automatically updates the random sample consensus (RANSAC) distance threshold used for eliminating outliers according to environmental changes. Experimental results show that our approach tracks an object robustly in real time in an SAR environment with frequent occlusions caused by augmented projection images.
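The two-thread architecture summarized in the abstract can be illustrated with a short sketch. The code below is not the authors' implementation; it is a minimal Python/OpenCV approximation assuming ORB features in the detection thread, pyramidal Lucas-Kanade optical flow in the tracking thread, and a simple inlier-ratio rule (an assumption, not the paper's update scheme) for adapting the RANSAC reprojection threshold. All names (detect_loop, track_loop, shared, ransac_thr) are illustrative.

```python
# Minimal sketch of a two-thread tracker (assumed design, not the paper's code).
import threading
import queue
import cv2
import numpy as np

orb = cv2.ORB_create(nfeatures=500)
lk_params = dict(winSize=(21, 21), maxLevel=3)

frame_queue = queue.Queue(maxsize=1)   # latest frame handed to the slow detector
shared = {"pts": None, "prev_gray": None, "ransac_thr": 3.0}
lock = threading.Lock()

def detect_loop(ref_kp, ref_des):
    """Slow thread: re-detect features and re-localize the object with RANSAC."""
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    while True:
        gray = frame_queue.get()
        kp, des = orb.detectAndCompute(gray, None)
        if des is None:
            continue
        matches = matcher.match(ref_des, des)
        if len(matches) < 8:
            continue
        src = np.float32([ref_kp[m.queryIdx].pt for m in matches])
        dst = np.float32([kp[m.trainIdx].pt for m in matches])
        with lock:
            thr = shared["ransac_thr"]
        H, mask = cv2.findHomography(src, dst, cv2.RANSAC, thr)
        if H is None:
            continue
        inliers = dst[mask.ravel() == 1]
        # Crude adaptive rule (assumption): loosen the threshold when few inliers
        # survive (e.g., under projection-induced appearance changes), else tighten.
        ratio = len(inliers) / len(dst)
        with lock:
            shared["ransac_thr"] = float(np.clip(thr * (1.2 if ratio < 0.5 else 0.9), 1.0, 10.0))
            shared["pts"] = inliers.reshape(-1, 1, 2)
            shared["prev_gray"] = gray

def track_loop(capture):
    """Fast thread: propagate the last detected features with pyramidal LK flow."""
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        if not frame_queue.full():
            frame_queue.put(gray)          # give the newest frame to the detector
        with lock:
            pts, prev = shared["pts"], shared["prev_gray"]
        if pts is not None and prev is not None and len(pts) > 0:
            nxt, st, _ = cv2.calcOpticalFlowPyrLK(prev, gray, pts, None, **lk_params)
            good = nxt[st.ravel() == 1]
            with lock:
                shared["pts"] = good.reshape(-1, 1, 2)
                shared["prev_gray"] = gray
            # `good` approximates the tracked object points for this frame.
```

In use, each loop would run in its own threading.Thread over a shared camera capture; pose estimation from the homography, projector calibration, and shutdown handling are omitted here.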

Keywords

References

  1. H. Benko, R. Jota, and A. Wilson, "MirageTable: Freehand Interaction on a Projected Augmented Reality Tabletop," in Proc. SIGCHI Conf. Human Factors Comput. Syst., Austin, TX, USA, May 2012, pp. 199-208.
  2. J.H. Lee et al., "FRC Based Augmented Reality for Aiding Cooperative Activities," in 2013 IEEE RO-MAN, Gyeongju, Rep. of Korea, Aug. 2013, pp. 294-295.
  3. R. Ziola, S. Grampurohit, N. Landes, J. Fogarty, and B. Harrison, "Examining Interaction with General-Purpose Object Recognition in LEGO OASIS," in 2011 IEEE Symp. Vis. Lang. Human-Centric Comput. (VL/HCC), Pittsburgh, PA, USA, Sept. 2011, pp. 65-68.
  4. A. Lee, J.D. Suh, and J. Lee, "Interactive Design of Planar Curves Based on Spatial Augmented Reality," in Proc. Companion Publication Int. Conf. Intell. User Interfaces Companion, Santa Monica, CA, USA, Mar. 2013, pp. 53-54.
  5. D. Lowe, "Distinctive Image Features from Scale-Invariant Keypoints," Int. J. Comput. Vis., vol. 60, no. 2, 2004, pp. 91-110. https://doi.org/10.1023/B:VISI.0000029664.99615.94
  6. H. Bay, T. Tuytelaars, and L. Van Gool, "Speeded-Up Robust Features (SURF)," Eur. Conf. Comput. Vis., vol. 3951, 2006, pp. 404-417.
  7. E. Rublee, V. Rabaud, K. Konolige, and G. Bradski, "ORB: An Efficient Alternative to SIFT or SURF," 2011 IEEE Int. Conf. Comput. Vis. (ICCV), Barcelona, Spain, Nov. 2011, pp. 2564-2571.
  8. T. Lee and T. Hollerer, "Multithreaded Hybrid Feature Tracking for Markerless Augmented Reality," IEEE Trans. Vis. Comput. Graph., vol. 15, no. 3, 2009, pp. 355-368. https://doi.org/10.1109/TVCG.2008.190
  9. A. Davison, I.D. Reid, N.D. Molton, and O. Stasse, "MonoSLAM: Real-Time Single Camera SLAM," IEEE Trans. Pattern Anal. Mach. Intell., vol. 29, no. 6, 2007, pp. 1052-1067. https://doi.org/10.1109/TPAMI.2007.1049
  10. A. Lee, J.H. Lee, and J. Kim, "Data-Driven Kinematic Control for Robotic Spatial Augmented Reality System with Loose Kinematic Specifications," ETRI J., vol. 38, no. 2, Apr. 2016, pp. 337-346. https://doi.org/10.4218/etrij.16.0115.0610
  11. X. Wang, Z. Yao, and Z. Yang, "The Use of Object Tracking in Visual SLAM," in IEEE Int. Conf. Appl. Syst. Innovation (ICASI), Sapporo, Japan, May 2017, pp. 850-853.
  12. M.A. Fischler and R.C. Bolles, "Random Sample Consensus: A Paradigm for Model Fitting with Applications to Image Analysis and Automated Cartography," Commun. ACM, vol. 24, no. 6, 1981, pp. 381-395. https://doi.org/10.1145/358669.358692
  13. R. Mur-Artal, J.M.M. Montiel, and J.D. Tardos, "ORB-SLAM: A Versatile and Accurate Monocular SLAM System," IEEE Trans. Robot., vol. 31, no. 5, 2015, pp. 1147-1163. https://doi.org/10.1109/TRO.2015.2463671
  14. L. Cheng, M. Li, Y. Liu, W. Cai, Y. Chen, and K. Yang, "Remote Sensing Image Matching by Integrating Affine Invariant Feature Extraction and RANSAC," Comput. Electr. Eng., vol. 38, no. 4, 2012, pp. 1023-1032. https://doi.org/10.1016/j.compeleceng.2012.03.003
  15. A. Lee and J.-H. Lee, "Multi-threaded Tracker with Outlier Filtering for Spatial Augmented Reality," in Int. Tech. Conf. Circuits Syst., Comput. Commun. (ITC-CSCC), Seoul, Rep. of Korea, July 2015, pp. 494-495.
  16. J.-H. Lee et al., "Calibration Issues in FRC: Camera, Projector, Kinematics Based Hybrid Approach," in Proc. Ubiquitous Robots Ambient Intell., Daejeon, Rep. of Korea, Nov. 2012, pp. 218-219.
  17. J.-H. Lee, "An Analytic Solution to Projector Pose Estimation Problem," ETRI J., vol. 34, no. 6, Dec. 2012, pp. 978-981. https://doi.org/10.4218/etrij.12.0212.0089
  18. Z. Zhang, "A Flexible New Technique for Camera Calibration," IEEE Trans. Pattern Anal. Mach. Intell., vol. 22, no. 11, Nov. 2000, pp. 1330-1334. https://doi.org/10.1109/34.888718
  19. J. Weng, P. Cohen, and M. Herniou, "Camera Calibration with Distortion Models and Accuracy Evaluation," IEEE Trans. Pattern Anal. Mach. Intell., vol. 14, no. 10, 1992, pp. 965-980. https://doi.org/10.1109/34.159901
  20. R. Tsai, "A Versatile Camera Calibration Technique for High-Accuracy 3D Machine Vision Metrology Using Off-the-Shelf TV Cameras and Lenses," IEEE J. Robot. Autom., vol. 3, no. 4, 1987, pp. 323-344. https://doi.org/10.1109/JRA.1987.1087109
  21. J.-Y. Bouguet, "Pyramidal Implementation of the Affine Lucas Kanade Feature Tracker Description of the Algorithm," Intel Corp., vol. 5, 2001, pp. 1-10.
  22. J.-W. Choi, D. Moon, and H.H. Yoo, "Robust Multi-person Tracking for Real-Time Intelligent Video Surveillance," ETRI J., vol. 37, no. 3, June 2015, pp. 551-561. https://doi.org/10.4218/etrij.15.0114.0629
  23. J.M. Prewitt and M.L. Mendelsohn, "The Analysis of Cell Images," Ann. N.Y. Acad. Sci., vol. 128, no. 3, 1966, pp. 1035-1053.
  24. M. Fornasier and H. Rauhut, "Iterative Thresholding Algorithms," Appl. Comput. Harmon. Anal., vol. 25, no. 2, 2008, pp. 187-208. https://doi.org/10.1016/j.acha.2007.10.005
  25. W.H. Tsai, "Moment-Preserving Thresholding: A New Approach," Comput. Vis. Graph. Image Process., vol. 29, no. 3, 1985, pp. 377-393. https://doi.org/10.1016/0734-189X(85)90133-1
  26. W. Doyle, "Operations Useful for Similarity-Invariant Pattern Recognition," J. ACM, vol. 9, no. 2, 1962, pp. 259-267. https://doi.org/10.1145/321119.321123
  27. N. Otsu, "A Threshold Selection Method from Gray-Level Histograms," IEEE Trans. Syst. Man Cybern., vol. 9, no. 1, 1979, pp. 62-66.
  28. J.H. Cha, Y.S. Jeon, Y.S. Moon, and S.H. Lee, "Seamless and Fast Panoramic Image Stitching," in IEEE Int. Conf. Consumer Electron. (ICCE), Las Vegas, NV, USA, Jan. 2012, pp. 29-30.
  29. F. Zhang and F. Liu, "Parallax-Tolerant Image Stitching," in Proc. IEEE Conf. Comput. Vis. Pattern Recogn., Columbus, OH, USA, June 2014, pp. 3262-3269.
  30. Y. Xiong and K. Pulli, "Sequential Image Stitching for Mobile Panoramas," in IEEE Int. Conf. Inf., Commun. Signal Process. (ICICS), Macau, China, Dec. 2009.
  31. M. Brown and D.G. Lowe, "Automatic Panoramic Image Stitching Using Invariant Features," Int. J. Comput. Vis., vol. 74, no. 1, 2007, pp. 59-73. https://doi.org/10.1007/s11263-006-0002-3

Cited by

  1. Augmented Reality for Robotics: A Review, vol. 9, no. 2, 2020, https://doi.org/10.3390/robotics9020021
  2. Investigating the impact of economic, political, and social factors on augmented reality technology acceptance in agriculture (livestock farming) sector in a developing country, vol. 67, 2021, https://doi.org/10.1016/j.techsoc.2021.101739