A Study on the Windows Application Control Model Based on Leap Motion

  • Kim, Won (Division of IT Convergence, Woosong University)
  • Received : 2019.10.10
  • Accepted : 2019.11.20
  • Published : 2019.11.28

Abstract

With the recent rapid development of computing power, many technologies that make the interaction between humans and computers more convenient are being studied, and the paradigm is shifting from GUIs based on traditional input devices to NUIs that use the body, such as 3D motion, haptics, and multi-touch. A great deal of research has been conducted on conveying human movements to computers through sensors, and with the development of optical sensors that can capture 3D objects, the range of applications has expanded into the industrial, medical, and user interface fields. In this paper, I propose a Leap Motion-based model in which the user's hand gestures replace the mouse, the default input device, to launch other programs and control Windows, and which, converged with an Android app connected to the main client, allows various media to be controlled through voice recognition, buttons, and voice command functions. With the proposed model, Internet media such as video and music can be controlled not only on the client computer but also remotely through the app, so that media can be viewed conveniently.
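
As an illustration of the gesture-based mouse replacement described above, the following minimal sketch maps the palm position reported by the sensor to the Windows cursor and treats a key-tap gesture as a left click. The paper does not publish its implementation, so the choice of the legacy Leap Motion Python SDK (v2) and the pyautogui library, as well as the interaction-box mapping and the key-tap click, are assumptions made here for illustration only.

    # A minimal sketch (not the paper's implementation): map Leap Motion hand
    # tracking to Windows mouse control. Assumes the legacy Leap Motion SDK v2
    # Python bindings (Leap) and the pyautogui library are installed; both are
    # assumptions, since the paper does not disclose its implementation.
    import time

    import Leap          # legacy Leap Motion SDK v2 Python bindings
    import pyautogui     # mouse/keyboard automation

    SCREEN_W, SCREEN_H = pyautogui.size()


    class MouseListener(Leap.Listener):
        def on_connect(self, controller):
            # Enable the key-tap gesture so a quick downward finger tap acts as a click.
            controller.enable_gesture(Leap.Gesture.TYPE_KEY_TAP)

        def on_frame(self, controller):
            frame = controller.frame()

            if not frame.hands.is_empty:
                hand = frame.hands[0]
                # Normalize the palm position into the interaction box ([0, 1] range)
                # and scale it to screen coordinates.
                point = frame.interaction_box.normalize_point(hand.palm_position, True)
                x = point.x * SCREEN_W
                y = (1.0 - point.y) * SCREEN_H   # Leap y grows upward, screen y grows downward
                pyautogui.moveTo(x, y)

            # Treat every key-tap gesture in this frame as a left mouse click.
            for gesture in frame.gestures():
                if gesture.type == Leap.Gesture.TYPE_KEY_TAP:
                    pyautogui.click()


    def main():
        listener = MouseListener()
        controller = Leap.Controller()
        controller.add_listener(listener)   # tracking runs on SDK callbacks
        try:
            while True:
                time.sleep(0.1)             # keep the process alive
        except KeyboardInterrupt:
            pass
        finally:
            controller.remove_listener(listener)


    if __name__ == "__main__":
        main()

In the same spirit, the remote-control path through the Android app could be realized with an ordinary TCP socket between the app and this client, with the client translating received text commands (for example "play" or "volume up") into media-key presses; this part is likewise not specified in the paper and is only one possible design.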

Keywords
