http://dx.doi.org/10.15207/JKCS.2019.10.11.111

A Study on the Windows Application Control Model Based on Leap Motion  

Kim, Won (Division of IT Convergence, Woosong University)
Publication Information
Journal of the Korea Convergence Society, v.10, no.11, 2019, pp. 111-116
Abstract
With the recent rapid development of computing capabilities, various technologies that facilitate interaction between humans and computers are being studied. The paradigm is shifting from GUIs based on traditional input devices toward NUIs that use the body, such as 3D motion, haptics, and multi-touch. Various studies have been conducted on conveying human movements to computers using sensors, and with the development of optical sensors that can capture 3D objects, the range of applications in the industrial, medical, and user-interface fields has expanded. In this paper, I provide a model that can execute programs through gestures, in place of the mouse as the default input device, and control Windows based on Leap Motion. The proposed model also converges with an Android application: through a connection with the main client, various media can be controlled by buttons and by voice commands using speech recognition. It is expected that, with the proposed model, Internet media such as video and music can be controlled not only from the client computer but also remotely from the application, enabling convenient media viewing.
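To make the gesture-to-command idea concrete, the following is a minimal sketch using the legacy Leap Motion Python SDK (v2, Python 2.7), not the paper's actual implementation. The listener and gesture calls follow that SDK's documented API; the mapping of a horizontal swipe to launching Notepad is a purely illustrative assumption standing in for the paper's gesture-to-program bindings.

```python
# Minimal sketch: mapping a Leap Motion swipe gesture to a Windows shell
# command, assuming the legacy Leap Motion Python SDK (v2). The specific
# gesture-to-program mapping here is an illustrative assumption.
import os
import Leap


class GestureCommandListener(Leap.Listener):
    """Launches a program when a completed swipe gesture is detected."""

    def on_connect(self, controller):
        # Gestures must be enabled explicitly before frames report them.
        controller.enable_gesture(Leap.Gesture.TYPE_SWIPE)

    def on_frame(self, controller):
        frame = controller.frame()
        for gesture in frame.gestures():
            if (gesture.type == Leap.Gesture.TYPE_SWIPE
                    and gesture.state == Leap.Gesture.STATE_STOP):
                swipe = Leap.SwipeGesture(gesture)
                # Treat a predominantly horizontal swipe as a launch command;
                # the launched program is hypothetical.
                if abs(swipe.direction.x) > abs(swipe.direction.y):
                    os.startfile("notepad.exe")  # Windows-only shell launch


def main():
    listener = GestureCommandListener()
    controller = Leap.Controller()
    controller.add_listener(listener)
    try:
        raw_input("Press Enter to quit...")  # SDK v2 targets Python 2.7
    finally:
        controller.remove_listener(listener)


if __name__ == "__main__":
    main()
```

In the same spirit, the Android-side voice control described in the abstract could forward recognized commands to the main client over a socket connection, with the client process translating each command into the corresponding media-control action.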
Keywords
Convergence; 3D Motion; Leap-Motion; Gesture; Media Control;