References
- B. Sodgerel, Y. K. Kim & M. H. Kim. (2015). 8-Straight Line Directions Recognition Algorithm for Hand Gestures Using Coordinate Information. Journal of Digital Convergence, 13(9), 259-267. DOI : 10.14400/JDC.2015.13.9.259
- M. S. An & D. S. Kang. (2012). Hand Gesture Recognition Algorithm for Immersive Interface. The Journal of Korean Institute of Information Technology, 10(3), 189-194.
- J. S. Shin, K. R. Ko & S. B. Pan. (2014). Automation of Human Body Model Data Measurement Using Kinect in Motion Capture System. The Journal of Korean Institute of Information Technology, 12(9), 173-180. DOI : 10.14801/kitr.2014.12.9.173
- M. J. Kim, J. M. Heo, J. H. Kim, S. Y. Park & J. N. Chang. (2014). Development and Evaluation of Leapmotion-based Game Interface Considering Intuitive Hand Gestures. Korean Society For Computer Game, 27(4), 69-75.
- J. H. Nam, S. H. Yang, W. Hu & B. G. Kim. (2014). A new study on hand gesture recognition algorithm using leap motion system. Journal of Korea Multimedia Society, 17(11), 1263-1269. DOI : 10.9717/kmms.2014.17.11.1263
- P. S. Shin, S. K. Kim & J. M. Kim. (2014). Intuitive Controller based on G-Sensor for Flying Drone. Journal of Digital Convergence, 12(1), 319-324. DOI : 10.14400/JDPM.2014.12.1.319
- S. R. Jeong & S. J. Chang. (2019). Production of fusion-type realistic contents using 3D Motion control technology. Journal of Convergence for Information Technology, 9(4), 146-151. DOI : 10.22156/CS4SMB.2019.9.4.146
- J. H. Park & K. J. Lee. (2017). Realization of user-centered smart factory system using motion recognition. Journal of Convergence for Information Technology, 7(6), 153-158. DOI : 10.22156/CS4SMB.2017.7.6.153
- K. Khoshelham & S. O. Elberink. (2012). Accuracy and resolution of kinect depth data for indoor mapping applications. Sensors, 12(2), 1437-1454. DOI : 10.3390/s120201437
- K. K. Biswas & S. K. Basu. (2011). Gesture recognition using Microsoft Kinect®. In The 5th International Conference on Automation, Robotics and Applications, IEEE, 100-103. DOI : 10.1109/ICARA.2011.6144864
- E. Chng. (2012). New ways of accessing information spaces using 3D multitouch tables. In 2012 International Conference on Cyberworlds, IEEE, 144-150. DOI : 10.1109/CW.2012.27
- S. Mitra & T. Acharya. (2007). Gesture recognition: A survey. IEEE Transactions on Systems, Man, and Cybernetics, Part C (Applications and Reviews), 37(3), 311-324. DOI : 10.1109/TSMCC.2007.893280
- C. H. Morimoto & M. R. Mimica. (2005). Eye gaze tracking techniques for interactive applications. Computer Vision and Image Understanding, 98(1), 4-24. DOI : 10.1016/j.cviu.2004.07.010
- M. A. Anusuya & S. K. Katti. (2011). Front end analysis of speech recognition: a review. International Journal of Speech Technology, 14(2), 99-145. DOI : 10.1007/s10772-010-9088-7
- L. C. Ebert, P. M. Flach, M. J. Thali & S. Ross. (2014). Out of touch - A plugin for controlling OsiriX with gestures using the leap controller. Journal of Forensic Radiology and Imaging, 2(3), 126-128. DOI : 10.1016/j.jofri.2014.05.006
- K. S. Lee, S. H. Oh, K. H. Jeon, S. S. Kang, D. H. Ryu & B. G. Kim. (2012). A Study on Smart Touch Projector System Technology using Infrared (IR) Imaging Sensor. Journal of Korea Multimedia Society, 15(7), 870-878. DOI : 10.9717/kmms.2012.15.7.780
- J. W. Shin, J. S. Kim, G. S. Hong & B. G. Kim. (2018). Development of Health Care System for Elderly People with Dementia Based on Leap Motion Sensor. Journal of Digital Contents Society, 19(2), 319-325. DOI : 10.9728/dcs.2018.19.2.319
- F. Weichert, D. Bachmann, B. Rudak & D. Fisseler. (2013). Analysis of the accuracy and robustness of the leap motion controller. Sensors, 13(5), 6380-6393. DOI : 10.3390/s130506380