http://dx.doi.org/10.7472/jksii.2015.16.6.11

NUI/NUX of the Virtual Monitor Concept using the Concentration Indicator and the User's Physical Features  

Jeon, Chang-hyun (Department of Computer Engineering, Sejong University)
Ahn, So-young (Department of Computer Engineering, Sejong University)
Shin, Dong-il (Department of Computer Engineering, Sejong University)
Shin, Dong-kyoo (Department of Computer Engineering, Sejong University)
Publication Information
Journal of Internet Computing and Services / v.16, no.6, 2015, pp. 11-21
Abstract
With growing interest in Human-Computer Interaction (HCI), research in the field has been conducted actively, including work on Natural User Interface/Natural User eXperience (NUI/NUX), which uses a user's gestures and voice. NUI/NUX requires recognition algorithms such as gesture recognition or voice recognition, but these algorithms have a weakness: they are complex to implement and require a great deal of training time, since they must go through steps including preprocessing, normalization, and feature extraction. Recently, Kinect, launched by Microsoft as an NUI/NUX development tool, has attracted attention, and many studies using Kinect have been conducted. In a previous study, the authors implemented a highly intuitive hand-mouse interface based on the physical features of the user. However, it had weaknesses such as unnatural mouse movement and low accuracy of mouse functions. In this study, we designed and implemented a hand-mouse interface that introduces a new concept called the 'virtual monitor', extracting the user's physical features through Kinect in real time. The virtual monitor is a virtual space that can be controlled by the hand mouse; a coordinate on the virtual monitor can be accurately mapped onto the corresponding coordinate on the real monitor. The hand-mouse interface based on the virtual-monitor concept retains the outstanding intuitiveness of the previous study while improving the accuracy of mouse functions. We further increased the accuracy of the interface by recognizing the user's unintended actions using a concentration indicator derived from the user's electroencephalogram (EEG) data. To evaluate the intuitiveness and accuracy of the interface, we tested it on 50 people ranging in age from their teens to their fifties. In the intuitiveness experiment, 84% of subjects learned how to use the interface within one minute. In the accuracy experiment, the mouse functions achieved accuracies of 80.4% (drag), 80% (click), and 76.7% (double-click). Having verified the intuitiveness and accuracy of the proposed hand-mouse interface through experiments, we expect it to serve as a good example of an interface for controlling systems by hand in the future.
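The core of the virtual-monitor idea described above is a mapping from a hand position inside a virtual rectangle (sized from the user's physical features) to pixel coordinates on the real monitor, with gestures filtered by the EEG concentration indicator. The following is a minimal sketch of that idea; the screen resolution, rectangle bounds, linear mapping, and concentration threshold are illustrative assumptions, not the authors' published implementation.

```python
# Sketch: map a hand position inside a "virtual monitor" rectangle onto
# real-screen pixel coordinates, and gate gestures on an EEG concentration
# indicator. All concrete values below are assumptions for illustration.

SCREEN_W, SCREEN_H = 1920, 1080  # assumed real-monitor resolution

def map_virtual_to_screen(hand_x, hand_y, vm_left, vm_top, vm_right, vm_bottom):
    """Linearly map a hand coordinate inside the virtual-monitor rectangle
    (e.g. Kinect skeleton-space metres) to real-monitor pixel coordinates."""
    nx = (hand_x - vm_left) / (vm_right - vm_left)  # normalize to 0..1
    ny = (hand_y - vm_top) / (vm_bottom - vm_top)
    nx = min(max(nx, 0.0), 1.0)                     # clamp to the rectangle
    ny = min(max(ny, 0.0), 1.0)
    return round(nx * (SCREEN_W - 1)), round(ny * (SCREEN_H - 1))

def accept_gesture(concentration, threshold=60.0):
    """Treat gestures made while the concentration indicator is below a
    threshold as unintended movement and ignore them (threshold assumed)."""
    return concentration >= threshold

# A hand at the centre of a 1 m x 1 m virtual monitor lands at screen centre:
print(map_virtual_to_screen(0.0, 0.0, -0.5, 0.5, 0.5, -0.5))
```

A hand outside the virtual rectangle is clamped to the nearest screen edge, so the cursor never leaves the monitor even if the user's arm overshoots.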
Keywords
NUI/NUX; Hand mouse; Virtual Monitor; EEG; Concentration indicator;
Citations & Related Records
Times Cited By KSCI: 4
1 W. Bland, T. Naughton, G. Vallee, and S. L. Scott, "Design and implementation of a menu based OSCAR command line interface," in High Performance Computing Systems and Applications, pp. 25-25, 2007. http://dx.doi.org/10.1109/hpcs.2007.14
2 M. Park, "Research on an effective graphic interface design of the web environment for people with disabilities - Focused on people with hearing impairment," Master's thesis, Department of Design, Sejong University, Seoul, 2005.
3 M. N. K. Boulos, B. J. Blanchard, C. Walker, J. Montero, A. Tripathy, and R. Gutierrez-Osuna, "Web GIS in practice X: a Microsoft Kinect natural user interface for Google Earth navigation," International journal of health geographics, vol. 10, no. 1, pp. 45, 2011. http://dx.doi.org/10.1186/1476-072x-10-45   DOI
4 M. F. Shiratuddin and K. W. Wong, "Non-contact multi-hand gestures interaction techniques for architectural design in a virtual environment," in Information Technology and Multimedia (ICIM), pp. 1-6, 2011. http://dx.doi.org/10.1109/icimu.2011.6122761   DOI
5 I. Chiang, J.-C. Tsai, and S.-T. Chen, "Using Xbox 360 kinect games on enhancing visual performance skills on institutionalized older adults with wheelchairs," in Digital Game and Intelligent Toy Enhanced Learning (DIGITEL), pp. 263-267, 2012. http://dx.doi.org/10.1109/digitel.2012.69   DOI
6 M. Ahn and S. Jun, "Principles and Technology Trends of Brain-Computer Interface Systems," The Korean Institute of Information Scientists and Engineers, vol. 29, no. 4, pp. 42-53, 2011.
7 K. Lee, T. Lee, and S. Lee, "Motor Imagery Brain Signal Analysis for EEG-based Mouse Control," Korean Journal of Cognitive Science, vol. 21, no. 2, pp. 309-338, 2010. http://dx.doi.org/10.19066/cogsci.2010.21.2.004   DOI
8 D. J. Sturman and D. Zeltzer, "A survey of glove-based input," Computer Graphics and Applications, IEEE, vol. 14, no. 1, pp. 30-39, 1994. http://dx.doi.org/10.1109/38.250916   DOI
9 F.-S. Chen, C.-M. Fu, and C.-L. Huang, "Hand gesture recognition using a real-time tracking method and hidden Markov models," Image and vision computing, vol. 21, no. 8, pp. 745-758, 2003. http://dx.doi.org/10.1016/s0262-8856(03)00070-2   DOI
10 J. Kim, "Bio-mimetic Recognition of Action Sequence using Unsupervised Learning," Journal of Internet Computing and Service, vol. 15, no. 4, pp. 9-20, 2014. http://dx.doi.org/10.7472/jksii.2014.15.4.09   DOI
11 J. Kim, "BoF based Action Recognition using Spatio-Temporal 2D Descriptor," Journal of Internet Computing and Service, vol. 16, no. 3, pp. 21-32, 2015. http://dx.doi.org/10.7472/jksii.2015.16.3.21   DOI
12 T. Baudel and M. Beaudouin-Lafon, "Charade: remote control of objects using free-hand gestures," Communications of the ACM, vol. 36, no. 7, pp. 28-35, 1993. http://dx.doi.org/10.1145/159544.159562   DOI
13 A. A. Argyros and M. I. Lourakis, "Vision-based interpretation of hand gestures for remote control of a computer mouse," in Computer Vision in Human-Computer Interaction, ed: Springer, pp. 40-51. 2006. http://dx.doi.org/10.1007/11754336_5   DOI
14 G. Lee, D. Shin, and D. Shin, "NUI/NUX framework based on intuitive hand motion," Journal of Internet Computing and Services, vol. 15, no. 3, pp. 11-19, 2014. http://dx.doi.org/10.7472/jksii.2014.15.3.11   DOI
15 J. O. Lubar and J. F. Lubar, "Electroencephalographic biofeedback of SMR and beta for treatment of attention deficit disorders in a clinical setting," Biofeedback and self-regulation, vol. 9, no. 1, pp. 1-23, 1984. http://dx.doi.org/10.1007/bf00998842   DOI
16 M. Sterman, "Sensorimotor EEG operant conditioning: Experimental and clinical effects," The Pavlovian Journal of Biological Science: Official Journal of the Pavlovian Society, vol. 12, no. 2, pp. 63-92, 1977. http://dx.doi.org/10.1177/155005940003100110   DOI
17 R. Kjeldsen and J. Kender, "Toward the use of gesture in traditional user interfaces," in Automatic Face and Gesture Recognition, pp. 151-156, 1996. http://dx.doi.org/10.1109/afgr.1996.557257   DOI
18 D. Oh and K. Hong, "Studies on the quantification of relaxation numbers using EEG," HCI 2014, pp. 853-856, 2014.
19 J. Lee, J. Lee, Y. Myeong, J. Lee, and H. Seong, "Development of Motion Recognition Hand-Mouse Using OpenCV," Telecommunications Research Institute Proceedings, Sangji University, vol. 7, no. 2, pp. 15-19, 2011.
20 E. Sanchez-Nielsen, L. Anton-Canalis, and C. Guerra-Artal, "An autonomous and user-independent hand posture recognition system for vision-based interface tasks," in Current Topics in Artificial Intelligence, ed: Springer, pp. 113-122, 2006. http://dx.doi.org/10.1007/11881216_13   DOI
21 S. Kang, C. Kim, and W. Son, "Developing User-friendly Hand Mouse Interface via Gesture Recognition," The Korean Society of Broadcast Engineers, pp. 129-132, 2009.