http://dx.doi.org/10.5370/KIEE.2014.63.4.565

A Study on Implementing Kinect-Based Control for LCD Display Contents  

Rho, Jungkyu (Dept. of Computer Science, Seokyeong University)
Publication Information
The Transactions of The Korean Institute of Electrical Engineers / v.63, no.4, 2014, pp. 565-569
Abstract
Recently, various kinds of new computer-controlled devices have been introduced in a wide range of areas, and convenient user interfaces for controlling these devices are strongly needed. To implement natural user interfaces (NUIs) on top of such devices, new technologies such as the touch screen, the Wii Remote, wearable interfaces, and Microsoft Kinect have been presented. This paper presents a natural and intuitive gesture-based model for controlling the contents of an LCD display. The Microsoft Kinect sensor and its SDK are used to recognize human gestures, and the gestures are interpreted into corresponding commands to be executed. A command dispatch model is also proposed in order to handle the commands more naturally. I expect that the proposed interface can be used in various fields, including display contents control.
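The abstract describes mapping recognized gestures to executable commands through a dispatch model. A minimal sketch of that idea is shown below; all names (`Gesture`, `CommandDispatcher`, the handler functions) are hypothetical illustrations, not the paper's actual implementation, which relies on the Microsoft Kinect SDK for gesture recognition.

```python
# Hypothetical sketch of a gesture-to-command dispatch model.
# In the paper, gestures are recognized via the Kinect SDK; here they
# are modeled as enum values for illustration only.

from enum import Enum


class Gesture(Enum):
    SWIPE_LEFT = "swipe_left"
    SWIPE_RIGHT = "swipe_right"
    PUSH = "push"


class CommandDispatcher:
    """Maps a recognized gesture to a display-content command handler."""

    def __init__(self):
        self._handlers = {}

    def register(self, gesture, handler):
        # Associate a gesture with the command it should trigger.
        self._handlers[gesture] = handler

    def dispatch(self, gesture):
        # Look up and execute the handler; unknown gestures are ignored.
        handler = self._handlers.get(gesture)
        return handler() if handler else None


# Example wiring: gestures drive page navigation on the display.
dispatcher = CommandDispatcher()
dispatcher.register(Gesture.SWIPE_LEFT, lambda: "next_page")
dispatcher.register(Gesture.SWIPE_RIGHT, lambda: "previous_page")
dispatcher.register(Gesture.PUSH, lambda: "select_item")

print(dispatcher.dispatch(Gesture.SWIPE_LEFT))  # prints "next_page"
```

Decoupling recognition from execution this way lets new gestures or commands be added by registering another handler, without touching the recognition code.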
Keywords
Natural user interface; Gesture recognition; Depth sensor; Kinect; Display contents;
Citations & Related Records
Times Cited By KSCI: 1
1 Google Earth, http://www.google.com/earth/
2 W. Li, Z. Zhang, and Z. Liu, "Action Recognition Based on A Bag of 3D Points," Computer Vision and Pattern Recognition Workshops (CVPRW), pp. 9-14, 2010.
3 I. Oikonomidis, N. Kyriazis, and A.A. Argyros, "Efficient Model-based 3D Tracking of Hand Articulations using Kinect," Proc. of British Machine Vision Conf., pp. 101.1-101.11, 2011.
4 D. Wigdor and D. Wixon, Brave NUI World: Designing Natural User Interfaces for Touch and Gesture, Morgan Kaufmann Publishers, 2011.
5 J. Sung, C. Ponce, B. Selman, and A. Saxena, "Human Activity Detection from RGBD Images," AAAI 2011 Workshop, pp. 47-55, 2011.
6 P. Suryanarayan, A. Subramanian, and D. Mandalapu, "Dynamic Hand Pose Recognition using Depth Data," Int'l Conf. on Pattern Recognition, pp. 3105-3108, 2010.
7 Kinect for Windows, http://www.microsoft.com/en-us/kinectforwindows/
8 P. J. Bristeau, F. Callou, D. Vissiere, and N. Petit, "The Navigation and Control Technology Inside the AR.Drone Micro UAV," IFAC World Congress, pp. 1477-1484, 2011.
9 P. Mistry and P. Maes, "SixthSense: A Wearable Gestural Interface," Proc. of SIGGRAPH Asia 2009 Sketches, Yokohama, Japan. 2009.
10 J. C. Lee, "Hacking the Nintendo Wii Remote," Pervasive Computing, IEEE, vol.7, issue 3, pp. 39-45, 2008.
11 S.-H. Jang, J.-W. Yoon, and S.-B. Cho, "User Interfaces for Visual Telepresence in Human-Robot Interaction Using Wii Controller," Journal of the HCI Society of Korea, vol.3, no.1, pp. 27-32, 2008.
12 A. Sanna, F. Lamberti, G. Paravati, and F. Manuri, "A Kinect-based Natural Interface for Quadrotor Control," Entertainment Computing, vol.4, issue 3, pp. 179-186, 2013.
13 T. Osunkoya and J. C. Chern, "Gesture-based Human Computer Interaction using Kinect for Windows Mouse Control and Powerpoint Presentation," Proc. of Midwest Instruction and Computing Symposium, Wisconsin, USA, 2013.
14 K. T.-M. Tran and S.-H. Oh, "Hand Gesture Recognition for 3D-Heritage-Tourism using Microsoft Kinect Sensor," Advanced Science and Technology Letters, vol.30, pp. 145-148, 2013.
15 R. Munoz-Salinas, R. Medina-Carnicer, F.J. Madrid-Cuevas, and A. Carmona-Poyato, "Depth Silhouettes for Gesture Recognition," Pattern Recognition Letters, vol.29, no.3, pp. 319-329, 2008.