NUI/NUX of the Virtual Monitor Concept using the Concentration Indicator and the User's Physical Features

  • Received : 2015.08.17
  • Accepted : 2015.11.07
  • Published : 2015.12.31

Abstract

As interest in Human-Computer Interaction (HCI) has grown, research on HCI has been actively conducted, and with it research on the Natural User Interface/Natural User eXperience (NUI/NUX), which uses the user's gestures and voice. NUI/NUX requires recognition algorithms such as gesture recognition or voice recognition. These algorithms, however, have a weakness: they are complex to implement and demand a great deal of training time, because they must pass through stages such as preprocessing, normalization, and feature extraction. Recently, Microsoft released Kinect as an NUI/NUX development tool; it has attracted wide attention, and many studies using Kinect have been conducted. In a previous study, the authors implemented a highly intuitive hand-mouse interface based on the user's physical features. That interface had weaknesses, however: the mouse moved unnaturally and the accuracy of the mouse functions was low. In this study, we designed and implemented a hand-mouse interface that introduces a new concept called the 'virtual monitor', extracting the user's physical features through Kinect in real time. The virtual monitor is a virtual space that can be controlled by the hand mouse, and coordinates on the virtual monitor can be mapped accurately onto coordinates on the real monitor. The hand-mouse interface based on the virtual monitor concept keeps the outstanding intuitiveness that was the strength of the previous study while improving the accuracy of the mouse functions. Furthermore, we increased the accuracy of the interface by recognizing the user's unintentional actions with a concentration indicator derived from the user's electroencephalogram (EEG) data. To evaluate the intuitiveness and accuracy of the interface, we tested it on 50 people ranging in age from their teens to their fifties. In the intuitiveness experiment, 84% of the subjects learned how to use it within one minute. In the accuracy experiment, the mouse functions showed accuracies of 80.4% (drag), 80% (click), and 76.7% (double-click). Having confirmed the intuitiveness and accuracy of the proposed hand-mouse interface through these experiments, we expect it to serve as a good example of an interface for controlling systems by hand in the future.
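To make the virtual-to-real coordinate mapping concrete, below is a minimal Python sketch of the idea under stated assumptions: the virtual monitor is modeled as a rectangle anchored in front of the user and sized from skeleton measurements (shoulder positions and arm length), and a hand position inside it is mapped linearly to screen pixels. The joint names, scale factors, and the linear mapping itself are our assumptions for illustration; the abstract does not give the paper's exact geometry.

```python
# Minimal sketch of the virtual-monitor mapping (hypothetical names and
# geometry; the paper's exact formulas are not given in the abstract).
# The virtual monitor is a rectangle in front of the user, sized from
# the user's physical features (here: shoulder midpoint and arm length
# from the Kinect skeleton). A hand position inside the rectangle is
# mapped linearly onto real-screen pixel coordinates.

from dataclasses import dataclass

@dataclass
class Point3D:
    x: float  # metres, Kinect camera space
    y: float
    z: float

def virtual_monitor_rect(shoulder_left: Point3D, shoulder_right: Point3D,
                         arm_length: float):
    """Place a virtual rectangle centred between the shoulders.

    Width and height are scaled from the user's reach so the whole
    screen is coverable with comfortable arm motion; the scale factors
    below are assumptions, not values from the paper.
    """
    cx = (shoulder_left.x + shoulder_right.x) / 2.0
    cy = (shoulder_left.y + shoulder_right.y) / 2.0
    width = 1.2 * arm_length   # assumed scale factor
    height = 0.9 * arm_length  # assumed aspect; adjust to the screen
    return (cx - width / 2.0, cy - height / 2.0, width, height)

def map_hand_to_screen(hand: Point3D, rect, screen_w=1920, screen_h=1080):
    """Linearly map a hand position in the virtual rectangle to pixels."""
    rx, ry, rw, rh = rect
    u = (hand.x - rx) / rw           # 0..1 across the virtual monitor
    v = 1.0 - (hand.y - ry) / rh     # Kinect y points up, screen y points down
    u = min(max(u, 0.0), 1.0)        # clamp so the cursor stays on screen
    v = min(max(v, 0.0), 1.0)
    return int(u * (screen_w - 1)), int(v * (screen_h - 1))
```

With a 1920x1080 screen, a hand at the centre of the rectangle maps to roughly (960, 540), and the clamping keeps the cursor on-screen when the hand drifts outside the virtual monitor.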

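The EEG-based filtering can be sketched in the same hedged way. The fragment below gates mouse actions on a simple concentration proxy, the ratio of beta-band power to alpha-plus-theta power, a common attention measure in neurofeedback work; the paper's actual indicator, frequency bands, and threshold are not specified in the abstract, so every name and constant here is a placeholder.

```python
# Hypothetical sketch of gating hand-mouse actions with an EEG
# concentration indicator: clicks and drags recorded while the indicator
# is below a threshold are suppressed, treating low-concentration motion
# as unintentional. The indicator is a placeholder ratio of beta power
# to (alpha + theta) power; the paper's exact index may differ.

def concentration_index(alpha_power: float, beta_power: float,
                        theta_power: float) -> float:
    """Beta / (alpha + theta) band-power ratio as a simple attention proxy."""
    return beta_power / max(alpha_power + theta_power, 1e-9)

def filter_mouse_events(events, eeg_samples, threshold=0.7):
    """Drop mouse actions recorded while concentration was below threshold.

    `events` and `eeg_samples` are assumed to be time-aligned, one EEG
    band-power triple (alpha, beta, theta) per mouse event; real systems
    would interpolate between EEG windows (a simplification here).
    """
    kept = []
    for event, (alpha, beta, theta) in zip(events, eeg_samples):
        if concentration_index(alpha, beta, theta) >= threshold:
            kept.append(event)
    return kept
```

The design point is that the gating happens at the event level: hand tracking keeps running continuously, but mouse actions produced while the concentration indicator is low are treated as unintentional and discarded.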