
User Evaluation of Encountered Type Haptic System with Visual-Haptic Co-location  

Cha, Baekdong (School of Mechanical Engineering, Gwangju Institute of Science and Technology)
Bae, Yoosung (School of Mechanical Engineering, Gwangju Institute of Science and Technology)
Choi, Wonil (Division of Liberal Arts and Sciences, GIST College)
Ryu, Jeha (School of Mechanical Engineering, Gwangju Institute of Science and Technology)
Publication Information
Journal of the HCI Society of Korea / v.14, no.2, 2019, pp. 13-20
Abstract
For encountered-type haptic display systems used in virtual training for industrial safety, visual-haptic co-location is required for natural interaction between virtual and real objects. In this paper, we performed a user evaluation of an immersive VR haptic system that implements a level of visual-haptic co-location through a careful and accurate calibration method. The goal of the evaluation was to show that user performance (reaction time and distance accuracy) does not differ significantly between the two environments for certain tasks. The results show a statistically significant difference in reaction time, but the absolute difference is less than 1 second; distance accuracy, meanwhile, shows no difference between the virtual and real environments. Therefore, it can be concluded that the developed haptic virtual training system can provide inexpensive industrial safety training in place of costly training in the actual environment.
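The comparison described above (virtual vs. real environment on reaction time and distance accuracy) amounts to a paired significance test across participants. The sketch below is illustrative only: the reaction-time numbers are invented, and the paired t-test shown is a common choice for such within-subject comparisons, not necessarily the analysis used in the paper.

```python
import math
from statistics import mean, stdev

# Hypothetical per-participant reaction times (seconds); these values
# are invented for illustration and are not the study's data.
real    = [1.20, 1.35, 1.10, 1.50, 1.25, 1.40, 1.30, 1.15]
virtual = [1.95, 2.10, 1.80, 2.25, 2.00, 2.15, 2.05, 1.90]

# Paired t-test: compute the per-participant differences and test
# whether their mean differs from zero.
diffs = [v - r for v, r in zip(virtual, real)]
n = len(diffs)
d_bar = mean(diffs)                          # mean difference (s)
t_stat = d_bar / (stdev(diffs) / math.sqrt(n))

# Two-tailed critical t value for df = n - 1 = 7 at alpha = 0.05.
T_CRIT = 2.365
significant = abs(t_stat) > T_CRIT

print(f"mean difference = {d_bar:.2f} s, t = {t_stat:.1f}, "
      f"significant = {significant}")
```

A result in the spirit of the abstract would be a statistically significant difference whose absolute size nonetheless stays under 1 second, as the mean difference does here.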
Keywords
VR; Virtual training simulations; Augmented Reality; Haptics; Visual-Haptic Co-location; Evaluation