http://dx.doi.org/10.4218/etrij.13.0212.0170

Tracking and Interaction Based on Hybrid Sensing for Virtual Environments  

Jo, Dongsik (Creative Content Research Laboratory, ETRI)
Kim, Yongwan (Creative Content Research Laboratory, ETRI)
Cho, Eunji (Intelligent Media Lab., POSTECH)
Kim, Daehwan (Creative Content Research Laboratory, ETRI)
Kim, Ki-Hong (Creative Content Research Laboratory, ETRI)
Lee, Gil-Haeng (Creative Content Research Laboratory, ETRI)
Publication Information
ETRI Journal, vol. 35, no. 2, 2013, pp. 356-359
Abstract
We present a method for tracking and interaction based on hybrid sensing for virtual environments. The proposed method tracks whole-body motion, including regions occluded from the sensor, to support high-precision interaction. For real-time tracking of the user's motion, we estimate each joint position of the human body by combining a depth sensor with a wand-type physical user interface, whose gyroscope and accelerometer readings are converted into positional data. Additionally, we construct virtual content and evaluate how well the proposed hybrid sensing-based whole-body tracking compensates for the occluded areas.
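The abstract describes combining depth-sensor joint estimates with positional data derived from the wand's gyroscope and accelerometer so that occluded joints can still be tracked. The following is a minimal Python sketch of that general idea only: dead-reckoning a position from inertial readings and falling back to it when the depth sensor's confidence drops. The confidence threshold, the blending rule, and all function names are illustrative assumptions, not the authors' actual fusion method.

import numpy as np

def imu_to_position(prev_pos, prev_vel, accel_world, dt):
    """Dead-reckon a position from a world-frame acceleration sample.

    Illustrates converting accelerometer readings into positional data by
    double integration; the paper's actual conversion and drift handling
    are not specified in the abstract.
    """
    vel = prev_vel + accel_world * dt
    pos = prev_pos + vel * dt
    return pos, vel

def fuse_joint_position(depth_pos, depth_confidence, imu_pos, threshold=0.5):
    """Blend the depth-sensor joint position with the wand-derived estimate.

    When the depth sensor reports low confidence (e.g. the joint is
    occluded), fall back to the IMU-based estimate. The threshold and the
    linear blend are assumptions made for this sketch.
    """
    if depth_confidence >= threshold:
        w = depth_confidence
        return w * depth_pos + (1.0 - w) * imu_pos
    return imu_pos

if __name__ == "__main__":
    # Hypothetical single-frame example for a hand joint (metres, seconds).
    dt = 1.0 / 30.0
    prev_pos = np.array([0.10, 1.20, 2.00])
    prev_vel = np.array([0.05, 0.00, 0.00])
    accel = np.array([0.20, -0.10, 0.00])   # gravity assumed already removed

    imu_pos, _ = imu_to_position(prev_pos, prev_vel, accel, dt)
    depth_pos = np.array([0.11, 1.19, 2.01])

    # Occluded frame: low depth confidence, so the result follows the
    # wand-based estimate.
    print(fuse_joint_position(depth_pos, depth_confidence=0.2, imu_pos=imu_pos))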
Keywords
Virtual reality; tracking; interaction; depth sensor; physical user interface (PUI); hybrid tracking