Study on Gesture and Voice-based Interaction in Perspective of a Presentation Support Tool

  • Ha, Sang-Ho (The School of Design and Human Engineering, Ulsan National Institute of Science and Technology (UNIST)) ;
  • Park, So-Young (The School of Design and Human Engineering, Ulsan National Institute of Science and Technology (UNIST)) ;
  • Hong, Hye-Soo (The School of Design and Human Engineering, Ulsan National Institute of Science and Technology (UNIST)) ;
  • Kim, Nam-Hun (The School of Design and Human Engineering, Ulsan National Institute of Science and Technology (UNIST))
  • Received : 2012.07.23
  • Accepted : 2012.07.31
  • Published : 2012.08.31

Abstract

Objective: This study aims to implement a non-contact gesture-based interface for presentation purposes and to analyze its effectiveness as an information transfer support device.

Background: With rapid technological growth in the UI/UX area and the appearance of smart service products that require new human-machine interfaces, research on control devices using gesture recognition or speech recognition has recently been active. However, while work on system implementation is abundant, relatively few quantitative studies have examined the practical effects of this new interface type.

Method: The system presented in this study is implemented with the KINECT® sensor offered by Microsoft Corporation. To investigate whether the proposed system is effective as a presentation support tool, we conducted experiments in which several lectures were given to 40 participants in both a traditional lecture room (keyboard-based presentation control) and a non-contact gesture-based lecture room (KINECT-based presentation control). We evaluated the participants' interest and immersion with respect to the lecture contents and lecturing methods, and analyzed their understanding of the lecture contents.

Result: Using ANOVA, we examined whether the gesture-based presentation system can play an effective role as a presentation support tool depending on the difficulty level of the contents.

Conclusion: We find that a non-contact gesture-based interface is a meaningful supportive device when delivering easy and simple information. However, the effect can vary with the contents and the difficulty level of the information provided.

Application: The results presented in this paper might help in designing new human-machine (computer) interfaces for communication support tools.
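The method described above maps presenter gestures captured by the KINECT sensor to presentation control commands such as advancing a slide. As a rough illustration of that kind of mapping, the Python sketch below detects a horizontal right-hand swipe from tracked joint positions and emits a slide command. All names here (SkeletonFrame, SwipeDetector, the thresholds) are hypothetical stand-ins for a real skeleton-tracking source and are not the paper's actual implementation.

```python
# Minimal, illustrative sketch: map a right-hand swipe to a slide command.
# SkeletonFrame stands in for one frame of joint positions from a tracker
# (e.g., a KINECT skeleton stream); the real system is not reproduced here.

from dataclasses import dataclass
from typing import Optional


@dataclass
class SkeletonFrame:
    """Hypothetical frame of tracked joints, in meters (sensor coordinates)."""
    right_hand_x: float
    right_shoulder_x: float
    timestamp: float  # seconds


class SwipeDetector:
    """Detects a horizontal right-hand swipe and maps it to a slide command."""

    def __init__(self, threshold_m: float = 0.35, window_s: float = 0.5):
        self.threshold_m = threshold_m  # hand travel required, relative to shoulder
        self.window_s = window_s        # maximum duration of one swipe
        self._start = None              # (relative_x, timestamp) at swipe start

    def update(self, frame: SkeletonFrame) -> Optional[str]:
        """Return 'NEXT_SLIDE' or 'PREV_SLIDE' when a swipe completes, else None."""
        rel_x = frame.right_hand_x - frame.right_shoulder_x
        if self._start is None:
            self._start = (rel_x, frame.timestamp)
            return None
        start_x, start_t = self._start
        if frame.timestamp - start_t > self.window_s:
            # Window expired without enough travel: restart from this frame.
            self._start = (rel_x, frame.timestamp)
            return None
        travel = rel_x - start_x
        if travel > self.threshold_m:
            self._start = None
            return "NEXT_SLIDE"   # outward swipe -> advance slide
        if travel < -self.threshold_m:
            self._start = None
            return "PREV_SLIDE"   # inward swipe -> go back
        return None


if __name__ == "__main__":
    detector = SwipeDetector()
    # Two fake frames: hand moves 0.4 m outward within 0.3 s -> NEXT_SLIDE.
    for f in [SkeletonFrame(0.10, 0.0, 0.00), SkeletonFrame(0.50, 0.0, 0.30)]:
        cmd = detector.update(f)
        if cmd:
            print(cmd)
```

In a complete system the emitted command would be forwarded to the presentation software, for example by injecting Right/Left arrow key events, which is how keyboard-based slide control is typically mimicked.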
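The Result section describes comparing the two lecture conditions with ANOVA by content difficulty. The sketch below shows one way such a comparison could be run with scipy.stats.f_oneway; the score arrays are made-up placeholders for illustration only, not the study's data.

```python
# Illustrative one-way ANOVA comparing comprehension scores between the
# keyboard-based and KINECT-based lecture conditions, separately for easy
# and difficult contents. All numbers below are placeholder values.

from scipy.stats import f_oneway

# Hypothetical comprehension scores (one value per participant).
easy_keyboard = [78, 82, 75, 80, 85, 79]
easy_kinect   = [84, 88, 83, 90, 86, 87]
hard_keyboard = [70, 74, 68, 72, 75, 71]
hard_kinect   = [69, 73, 70, 71, 74, 68]

for label, a, b in [("easy contents", easy_keyboard, easy_kinect),
                    ("difficult contents", hard_keyboard, hard_kinect)]:
    f_stat, p_value = f_oneway(a, b)  # one-way ANOVA across the two conditions
    print(f"{label}: F = {f_stat:.2f}, p = {p_value:.3f}")
```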

