• Title/Summary/Keyword: NUI (natural user interface)


A Study on a Stress Measurement Algorithm Based on ECG Analysis of NUI-applied Tangible Game Users (NUI가 적용된 체감형 게임의 사용자 심전도 분석에 의한 스트레스 측정 알고리즘 연구)

  • Lee, Hyun-Ju;Shin, Dong-Il;Shin, Dong-Kyoo
    • Journal of Korea Game Society / v.13 no.5 / pp.73-80 / 2013
  • NUI (Natural User Interface) allows users to interact directly with surrounding digital devices using their voices or body motions, without additional input/output interface devices. Our study was carried out on human users playing a tangible game with body motions in an NUI-applied smart space. ECG was measured for 60 seconds before and after playing the game to determine user stress levels, and the measured signals were analyzed with an improved Random Forest algorithm. To enable supervised learning, users additionally reported whether or not they felt stress. The improved algorithm showed 1.04% higher accuracy than the existing algorithm.
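
The abstract does not give implementation details of the improved Random Forest, so the following is only a minimal sketch of the general pipeline it describes: ECG segments reduced to feature vectors and classified as stress/no-stress with a standard scikit-learn RandomForestClassifier. The feature matrix, labels, and hyperparameters are placeholder assumptions.

```python
# Minimal sketch: stress classification from ECG-derived features.
# Assumes each 60 s ECG segment has been reduced to a small feature vector
# (e.g., heart-rate-variability statistics) and paired with a self-reported
# stress label; uses the standard RandomForestClassifier, not the paper's
# improved variant.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))        # placeholder features per ECG segment
y = rng.integers(0, 2, size=200)     # placeholder stress labels (0 = no, 1 = yes)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```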

Design of Gesture based Interfaces for Controlling GUI Applications (GUI 어플리케이션 제어를 위한 제스처 인터페이스 모델 설계)

  • Park, Ki-Chang;Seo, Seong-Chae;Jeong, Seung-Moon;Kang, Im-Cheol;Kim, Byung-Gi
    • The Journal of the Korea Contents Association / v.13 no.1 / pp.55-63 / 2013
  • NUI (Natural User Interfaces) developed out of CLI (Command Line Interfaces) and GUI (Graphical User Interfaces). NUI uses many different input modalities, including multi-touch, motion tracking, voice, and stylus. To adopt NUI in a legacy GUI application, a developer must add device libraries, modify the relevant source code, and debug it. In this paper, we propose a gesture-based interface model that can be applied without modifying existing event-based GUI applications, and we also present an XML schema for specifying the proposed model. A prototype demonstrates how the proposed model is used.
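
As a rough illustration of driving an unmodified event-based GUI application from gestures via an XML specification, the sketch below parses a hypothetical gesture-to-event mapping and dispatches synthetic events. The schema, gesture names, and `dispatch` helper are illustrative assumptions, not the paper's actual schema or API.

```python
# Minimal sketch: mapping recognized gestures to events of an unmodified GUI
# application via an XML specification. The schema and gesture names below are
# hypothetical illustrations.
import xml.etree.ElementTree as ET

MAPPING_XML = """
<gesture-map application="LegacyViewer">
    <gesture name="swipe-left" event="key" value="Page_Down"/>
    <gesture name="swipe-right" event="key" value="Page_Up"/>
    <gesture name="push" event="mouse" value="left-click"/>
</gesture-map>
"""

def load_mapping(xml_text):
    """Parse the XML specification into a gesture-name -> (event, value) table."""
    root = ET.fromstring(xml_text)
    return {g.get("name"): (g.get("event"), g.get("value")) for g in root.findall("gesture")}

def dispatch(gesture, mapping):
    """Translate a recognized gesture into a synthetic input event for the GUI app.
    Here we only print; a real adapter would inject the event via the platform API."""
    if gesture in mapping:
        event, value = mapping[gesture]
        print(f"inject {event} event: {value}")

mapping = load_mapping(MAPPING_XML)
dispatch("swipe-left", mapping)   # -> inject key event: Page_Down
```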

Gesture based Natural User Interface for e-Training

  • Lim, C.J.;Lee, Nam-Hee;Jeong, Yun-Guen;Heo, Seung-Il
    • Journal of the Ergonomics Society of Korea / v.31 no.4 / pp.577-583 / 2012
  • Objective: This paper describes the process and results of developing a gesture recognition-based natural user interface (NUI) for a vehicle maintenance e-Training system. Background: E-Training refers to education and training that builds and improves the capabilities needed to perform tasks by using information and communication technology (simulation, 3D virtual reality, and augmented reality), devices (PC, tablet, smartphone, and HMD), and environments (wired/wireless internet and cloud computing). Method: Palm movement from a depth camera is used as a pointing device, and finger movement, extracted with the OpenCV library, serves as the selection protocol. Results: The proposed NUI allows trainees to control objects, such as cars and engines, on a large screen through gesture recognition. It also includes a learning environment for understanding the procedure of assembling or disassembling certain parts. Conclusion: Future work concerns the implementation of gesture recognition technology for multiple trainees. Application: The results of this interface can be applied not only in e-Training systems but also in other systems such as digital signage, tangible games, and 3D content control.
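
A minimal sketch of the described input method, assuming a binary hand mask has already been segmented from the depth image: the palm centroid acts as the pointer and a crude convexity-defect finger count as the selection cue. The OpenCV calls are standard, but the thresholds and overall logic are illustrative, not the system's actual implementation.

```python
# Minimal sketch: palm position as a pointer and a rough finger count as the
# selection trigger, from a binary hand mask. Segmentation from the depth
# camera is assumed to have produced `mask` already; thresholds are illustrative.
import cv2
import numpy as np

def palm_pointer_and_fingers(mask):
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None, 0
    hand = max(contours, key=cv2.contourArea)           # largest blob = hand
    m = cv2.moments(hand)
    if m["m00"] == 0:
        return None, 0
    cx, cy = int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])  # palm centroid -> pointer

    hull = cv2.convexHull(hand, returnPoints=False)
    defects = cv2.convexityDefects(hand, hull)
    fingers = 0
    if defects is not None:
        for s, e, f, depth in defects[:, 0]:
            if depth > 10000:                           # deep valleys between fingers
                fingers += 1
    return (cx, cy), fingers

# Usage idea: the pointer follows the palm; an open hand (several fingers
# detected) could act as "select" while a closed fist acts as "release".
```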

NUI/NUX of the Virtual Monitor Concept using the Concentration Indicator and the User's Physical Features (사용자의 신체적 특징과 뇌파 집중 지수를 이용한 가상 모니터 개념의 NUI/NUX)

  • Jeon, Chang-hyun;Ahn, So-young;Shin, Dong-il;Shin, Dong-kyoo
    • Journal of Internet Computing and Services / v.16 no.6 / pp.11-21 / 2015
  • As interest in Human-Computer Interaction (HCI) grows, research on HCI has been actively conducted, together with research on Natural User Interface/Natural User eXperience (NUI/NUX), which uses a user's gestures and voice. NUI/NUX needs recognition algorithms such as gesture recognition or voice recognition, but these algorithms are complex to implement and require long training times because they involve preprocessing, normalization, and feature extraction. Recently, Kinect was launched by Microsoft as an NUI/NUX development tool, attracting wide attention, and studies using Kinect have been conducted. In a previous study, the authors implemented a highly intuitive hand-mouse interface using the physical features of the user, but it had weaknesses such as unnatural mouse movement and low accuracy of the mouse functions. In this study, we designed and implemented a hand-mouse interface that introduces a new concept called the 'virtual monitor', extracting the user's physical features through Kinect in real time. The virtual monitor is a virtual space that can be controlled by the hand mouse; coordinates on the virtual monitor can be accurately mapped onto coordinates on the real monitor. The hand-mouse interface based on the virtual monitor concept keeps the outstanding intuitiveness of the previous study and enhances the accuracy of the mouse functions. Furthermore, we increased the accuracy of the interface by recognizing the user's unnecessary actions using a concentration indicator derived from electroencephalogram (EEG) data. To evaluate the intuitiveness and accuracy of the interface, we tested it with 50 people in their teens to fifties. In the intuitiveness experiment, 84% of subjects learned how to use it within one minute. In the accuracy experiment, the mouse functions reached accuracies of 80.4% (drag), 80% (click), and 76.7% (double-click). Having checked the intuitiveness and accuracy of the proposed hand-mouse interface through experiments, we expect it to serve as a good example of an interface for controlling a system by hand in the future.
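
A minimal sketch of the virtual-monitor idea as described: a hand position inside a user-specific plane (sized from the user's physical features) is mapped to pixel coordinates on the real monitor. The plane parameters, screen size, and clamping are illustrative assumptions, not the paper's calibration.

```python
# Minimal sketch: mapping a hand position inside a user-specific "virtual
# monitor" plane to pixel coordinates on the real monitor.
def to_screen(hand_x, hand_y, plane, screen_w=1920, screen_h=1080):
    """plane = (left, top, width, height) of the virtual monitor in camera/skeleton
    coordinates (e.g., metres), derived from the user's physical features."""
    left, top, width, height = plane
    u = (hand_x - left) / width          # normalized horizontal position in the plane
    v = (hand_y - top) / height          # normalized vertical position in the plane
    u = min(max(u, 0.0), 1.0)            # clamp so the cursor stays on screen
    v = min(max(v, 0.0), 1.0)
    return int(u * (screen_w - 1)), int(v * (screen_h - 1))

# Example: a 0.5 m x 0.28 m plane placed in front of the user's shoulder.
print(to_screen(0.10, 0.05, plane=(-0.25, -0.14, 0.50, 0.28)))
```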

Visual Touch Recognition for NUI Using Voronoi-Tessellation Algorithm (보로노이-테셀레이션 알고리즘을 이용한 NUI를 위한 비주얼 터치 인식)

  • Kim, Sung Kwan;Joo, Young Hoon
    • The Transactions of The Korean Institute of Electrical Engineers / v.64 no.3 / pp.465-472 / 2015
  • This paper presents a visual touch recognition method for NUI (Natural User Interface) using the Voronoi-tessellation algorithm. The proposed method has three parts: hand region extraction, hand feature point extraction, and visual-touch recognition. To improve the robustness of hand region extraction, we propose using an RGB/HSI color model, the Canny edge detection algorithm, and spatial frequency information. To improve the accuracy of hand feature point extraction, we propose using the Douglas-Peucker algorithm, and to recognize the visual touch, we propose using the Voronoi-tessellation algorithm. Finally, we demonstrate the feasibility and applicability of the proposed algorithms through experiments.
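
A minimal sketch of two of the stages named above, assuming a hand contour has already been extracted: Douglas-Peucker simplification (via OpenCV's approxPolyDP) reduces the contour to feature points, and a Voronoi tessellation (via SciPy) is built around them. The toy contour and epsilon fraction are placeholders, and the touch-decision logic itself is omitted.

```python
# Minimal sketch: Douglas-Peucker simplification of a hand contour followed by
# a Voronoi tessellation of the resulting feature points.
import cv2
import numpy as np
from scipy.spatial import Voronoi

def feature_points(hand_contour, eps_fraction=0.01):
    """Reduce the contour to a few feature points with Douglas-Peucker."""
    eps = eps_fraction * cv2.arcLength(hand_contour, True)
    approx = cv2.approxPolyDP(hand_contour, eps, True)
    return approx.reshape(-1, 2).astype(float)

# A toy closed contour standing in for an extracted hand-region boundary.
contour = np.array([[[0, 0]], [[40, 5]], [[80, 0]], [[85, 60]],
                    [[40, 90]], [[5, 60]]], dtype=np.int32)
pts = feature_points(contour)
vor = Voronoi(pts)            # Voronoi cells around the hand feature points
print(pts.shape, len(vor.regions))
```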

A Study on Gesture Interface through User Experience (사용자 경험을 통한 제스처 인터페이스에 관한 연구)

  • Yoon, Ki Tae;Cho, Eel Hea;Lee, Jooyoup
    • Asia-pacific Journal of Multimedia Services Convergent with Art, Humanities, and Sociology / v.7 no.6 / pp.839-849 / 2017
  • Recently, the role of the kitchen has evolved from a space for mere survival into a space that reflects contemporary life and culture. Along with these changes, the use of IoT technology is spreading, and as a result new smart devices for the kitchen are being developed and diffused. The user experience of these smart devices is also becoming important, and for natural interaction between a user and a computer, better interactions can be expected based on context awareness. This paper examines a Natural User Interface (NUI) that requires no touching of the device, based on the user interface (UI) of a smart device used in the kitchen. In this method, image processing is used to recognize the user's hand gestures with the camera attached to the device, and the recognized hand shape is applied to the interface. The gestures used in this study are proposed according to the user's context and situation, and five kinds of gestures are classified and used in the interface.
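
As a rough illustration of context-dependent use of a small gesture vocabulary, the sketch below dispatches five recognized gesture labels to different kitchen-UI actions depending on the current context. The gesture names, contexts, and actions are invented for illustration and are not the gesture set defined in the paper.

```python
# Minimal sketch: context-dependent dispatch of five recognized gesture labels
# to smart-kitchen UI actions. All names below are illustrative assumptions.
ACTIONS = {
    "recipe-view": {"swipe-left": "next step", "swipe-right": "previous step",
                    "palm-open": "pause video", "fist": "resume video",
                    "point": "select ingredient"},
    "timer":       {"palm-open": "stop alarm", "swipe-left": "add 1 minute",
                    "swipe-right": "subtract 1 minute", "fist": "restart timer",
                    "point": "open timer list"},
}

def handle(gesture: str, context: str) -> str:
    """Map a recognized gesture to an action for the current UI context."""
    return ACTIONS.get(context, {}).get(gesture, "ignored")

print(handle("swipe-left", "recipe-view"))   # -> next step
print(handle("swipe-left", "timer"))         # -> add 1 minute
```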

NUI/NUX framework based on intuitive hand motion (직관적인 핸드 모션에 기반한 NUI/NUX 프레임워크)

  • Lee, Gwanghyung;Shin, Dongkyoo;Shin, Dongil
    • Journal of Internet Computing and Services / v.15 no.3 / pp.11-19 / 2014
  • The natural user interface/experience (NUI/NUX) provides a natural motion interface without devices or tools such as mice, keyboards, pens, and markers. Until now, typical motion recognition methods have used markers, receiving the coordinates of each marker as relative data and storing each coordinate value in a database. However, recognizing motion accurately requires more markers, and much time is taken attaching the markers and processing the data. In addition, because NUI/NUX frameworks have been developed without the most important property, intuitiveness, usability problems arise and users are forced to learn the usage of many NUI/NUX frameworks. To compensate for these problems, in this paper we avoid markers and implement the framework so that anyone can handle it. We also design a multi-modal NUI/NUX framework that controls voice, body motion, and facial expression simultaneously, and propose a new mouse-operation algorithm that recognizes intuitive hand gestures and maps them onto the monitor. We implement it so that users can handle the "hand mouse" operation easily and intuitively.
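
A minimal sketch of one way a "hand mouse" could turn intuitive hand states into mouse operations with a small state machine; the open-hand/fist states and press/release semantics are illustrative assumptions, not the paper's proposed algorithm.

```python
# Minimal sketch: translating hand states into mouse operations with a small
# state machine. Gesture states and event semantics are illustrative.
class HandMouse:
    def __init__(self):
        self.prev_state = "open"

    def update(self, state, x, y):
        """state: 'open' or 'fist'; (x, y): cursor position already mapped to the screen."""
        events = [("move", x, y)]
        if self.prev_state == "open" and state == "fist":
            events.append(("press", x, y))      # closing the hand presses the button
        elif self.prev_state == "fist" and state == "open":
            events.append(("release", x, y))    # opening the hand releases it (click/drag end)
        self.prev_state = state
        return events

hm = HandMouse()
print(hm.update("fist", 100, 120))  # -> [('move', 100, 120), ('press', 100, 120)]
print(hm.update("open", 140, 130))  # -> [('move', 140, 130), ('release', 140, 130)]
```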

Building Plan Research of Meeting System based on Multi-Touch Interface (멀티터치 인터페이스 회의시스템 구축 방안 연구)

  • Jang, Suk-Joo;Bak, Seon-Hui;Choi, Tae-Jun
    • The Journal of the Institute of Internet, Broadcasting and Communication / v.14 no.5 / pp.255-261 / 2014
  • The development of the IT industry has brought major changes to modern society. These changes apply in all areas: life becomes more and more convenient, and work becomes more efficient and is handled more quickly. The interface is a major factor at the center of these changes, and as interfaces continue to develop, NUI technology is now in use. NUI technology does not require a discrete input device; it uses natural behavior such as touch and gesture. Among such devices, the smartphone is representative of applying NUI technology. Besides smartphones, NUI technology is applied to kiosks and large tables and is used in various fields such as the culture, defense, and advertising industries. In this research, we develop a multi-touch meeting system based on a multi-touch table and propose possible efficiency improvements over existing meeting systems.

Human-Computer Interface using sEMG according to the Number of Electrodes (전극 개수에 따른 근전도 기반 휴먼-컴퓨터 인터페이스의 정확도에 대한 연구)

  • Lee, Seulbi;Chee, Youngjoon
    • Journal of the HCI Society of Korea / v.10 no.2 / pp.21-26 / 2015
  • An NUI (Natural User Interface) system interprets the user's natural movements or signals from the human body for the machine. sEMG (surface electromyogram) can be observed whenever a muscle exerts effort, even without actual movement, which is impossible with camera- and accelerometer-based NUI systems. In an sEMG-based movement recognition system, a minimal number of electrodes is preferred to minimize inconvenience. We analyzed the decrease in recognition accuracy as the number of electrodes decreases. For four kinds of movement intention without movement, extension (up), flexion (down), abduction (right), and adduction (left), a multilayer perceptron classifier was used with RMS (Root Mean Square) features from the sEMG. The classification accuracy was 91.9% with four channels, 87.0% with three channels, and 78.9% with two channels. To increase the accuracy with two channels of sEMG, RMSs from previous time epochs (50-200 ms) were used in addition. With the RMSs from 150 ms earlier, the accuracy increased from 78.9% to 83.6%. The decrease in accuracy with a minimal number of electrodes could be partly compensated by utilizing more features from previous RMSs.
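
A minimal sketch of the described feature/classifier setup: per-channel RMS of the current sEMG window, optionally augmented with RMSs from an earlier epoch, fed to a multilayer perceptron (scikit-learn's MLPClassifier). The signal, labels, window length, and offsets are synthetic placeholders, not the study's recordings or parameters.

```python
# Minimal sketch: RMS features from sEMG windows plus previous-epoch RMSs, fed
# to a multilayer perceptron. All data below are synthetic placeholders.
import numpy as np
from sklearn.neural_network import MLPClassifier

def rms(window):
    return np.sqrt(np.mean(np.square(window), axis=0))   # one RMS value per channel

def features(emg, t, win=100, prev_offsets=(150,)):
    """RMS of the current window per channel, plus RMSs from earlier epochs (offsets in samples)."""
    feats = [rms(emg[t:t + win])]
    feats += [rms(emg[t - off:t - off + win]) for off in prev_offsets]
    return np.concatenate(feats)

rng = np.random.default_rng(0)
emg = rng.normal(size=(10000, 2))                 # placeholder 2-channel sEMG recording
starts = np.arange(200, 9800, 100)
X = np.array([features(emg, t) for t in starts])
y = rng.integers(0, 4, size=len(starts))          # placeholder labels: up/down/right/left

clf = MLPClassifier(hidden_layer_sizes=(20,), max_iter=500, random_state=0)
clf.fit(X, y)
print("training accuracy:", clf.score(X, y))
```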

HOG-HOD Algorithm for Recognition of Multi-cultural Hand Gestures (다문화 손동작 인식을 위한 HOG-HOD 알고리즘)

  • Kim, Jiye;Park, Jong-Il
    • Journal of Korea Multimedia Society / v.20 no.8 / pp.1187-1199 / 2017
  • In recent years, research on the Natural User Interface (NUI) has attracted attention because NUI systems can give users a natural feeling in virtual reality. The most important issue in an NUI system is how to communicate with the computer system. There are many ways to interact with users, such as speech, hand gestures, and body actions. Among them, hand gestures are suitable for the purpose of NUI because people use them with relatively high frequency in daily life and a hand gesture carries meaning by itself. Such hand gestures are called multi-cultural hand gestures, and we propose a method to recognize this kind of hand gesture. The proposed method is composed of the Histogram of Oriented Gradients (HOG), used for hand shape recognition, and the Histogram of Oriented Displacements (HOD), used for recognizing the trajectory of the hand's center point.
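
A minimal sketch of the two descriptors named above: HOG (via scikit-image) for the hand-shape image and a simple histogram of displacement orientations for the hand-centre trajectory, standing in for HOD. The image, trajectory, and bin counts are placeholders; the paper's exact descriptor construction may differ.

```python
# Minimal sketch: HOG for hand shape and a simple histogram of displacement
# orientations for the hand-centre trajectory. Inputs are synthetic placeholders.
import numpy as np
from skimage.feature import hog

def hod(trajectory, bins=8):
    """Histogram of the orientations of successive displacements of the hand centre."""
    d = np.diff(np.asarray(trajectory, dtype=float), axis=0)
    angles = np.arctan2(d[:, 1], d[:, 0])                       # displacement direction per step
    hist, _ = np.histogram(angles, bins=bins, range=(-np.pi, np.pi))
    return hist / max(hist.sum(), 1)                            # normalized orientation histogram

hand_image = np.random.default_rng(0).random((64, 64))          # placeholder hand-region image
shape_descriptor = hog(hand_image, orientations=9,
                       pixels_per_cell=(8, 8), cells_per_block=(2, 2))
trajectory = [(0, 0), (2, 1), (4, 3), (5, 6), (5, 9)]           # placeholder hand-centre path
motion_descriptor = hod(trajectory)
print(shape_descriptor.shape, motion_descriptor)
```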