• Title/Summary/Keyword: NUI/NUX

NUI/NUX framework based on intuitive hand motion (직관적인 핸드 모션에 기반한 NUI/NUX 프레임워크)

  • Lee, Gwanghyung;Shin, Dongkyoo;Shin, Dongil
    • Journal of Internet Computing and Services / v.15 no.3 / pp.11-19 / 2014
  • The natural user interface/experience (NUI/NUX) enables natural, motion-based interaction without devices or tools such as mice, keyboards, pens, and markers. Until now, typical motion recognition methods have used markers, receiving the coordinates of each marker as relative data and storing each coordinate value in a database. However, recognizing motion accurately requires more markers, and attaching the markers and processing the data take considerable time. In addition, because NUI/NUX frameworks have been developed without the most important quality, intuitiveness, usability problems arise and users are forced to learn the conventions of many different frameworks. To address this problem, the approach in this paper uses no markers and is implemented so that anyone can operate it. The authors also designed a multi-modal NUI/NUX framework that controls voice, body motion, and facial expression simultaneously, and proposed a new mouse-operation algorithm that recognizes intuitive hand gestures and maps them onto the monitor, so that users can perform the "hand mouse" operation easily and intuitively.
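
As a rough illustration of the hand-mouse idea described in the abstract above, the sketch below maps a tracked hand position inside an interaction box onto monitor pixel coordinates. The tracker interface and the interaction-box bounds are assumptions made for the example, not details taken from the paper.

```python
# Minimal sketch, assuming a tracker that reports the hand position in metres.
# The interaction-box bounds and screen size below are illustrative only.

SCREEN_W, SCREEN_H = 1920, 1080

# Interaction box in tracker coordinates (metres): left..right, low..high.
BOX_X = (-0.3, 0.3)
BOX_Y = (0.9, 1.5)

def to_screen(hand_x: float, hand_y: float) -> tuple[int, int]:
    """Linearly map a hand position inside the interaction box to pixels."""
    u = (hand_x - BOX_X[0]) / (BOX_X[1] - BOX_X[0])
    v = (hand_y - BOX_Y[0]) / (BOX_Y[1] - BOX_Y[0])
    u, v = min(max(u, 0.0), 1.0), min(max(v, 0.0), 1.0)
    # Screen y grows downward while tracker y grows upward, so flip v.
    return int(u * (SCREEN_W - 1)), int((1.0 - v) * (SCREEN_H - 1))

if __name__ == "__main__":
    print(to_screen(0.0, 1.2))  # near the centre of the screen
```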

Arm Orientation Estimation Method with Multiple Devices for NUI/NUX

  • Sung, Yunsick;Choi, Ryong;Jeong, Young-Sik
    • Journal of Information Processing Systems / v.14 no.4 / pp.980-988 / 2018
  • Motion estimation is a key Natural User Interface/Natural User Experience (NUI/NUX) technology for using motions as commands. HTC VIVE is an excellent device for estimating motions, but it considers only the positions of the hands, not the orientations of the arms. Even when the hand positions are the same, the meaning of a motion can differ according to the arm orientations; therefore, when hand positions are measured and used, the arm orientations should be estimated as well. This paper proposes a method for estimating arm orientations based on the Bayesian probability of hand positions measured in advance. In experiments, the proposed method was applied to hand positions measured with HTC VIVE. The method estimated orientations with an error rate of about 19%, demonstrating the possibility of estimating the orientation of a body part without additional devices.
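
The following sketch shows one way a Bayesian choice over discretized arm orientations could be driven by hand positions recorded in advance, in the spirit of the method summarized above. The OrientationEstimator class, the grid discretization, and the Laplace smoothing are illustrative assumptions rather than the paper's implementation.

```python
# Illustrative sketch: orientations are discretized labels, and
# P(position cell | orientation) is estimated from hand positions recorded
# in advance; estimation picks the orientation with the largest posterior.
from collections import defaultdict

class OrientationEstimator:
    def __init__(self, cell_size: float = 0.1):
        self.cell_size = cell_size
        self.counts = defaultdict(lambda: defaultdict(int))  # orientation -> cell -> count
        self.totals = defaultdict(int)                       # orientation -> total samples

    def _cell(self, pos):
        return tuple(int(c // self.cell_size) for c in pos)

    def train(self, orientation, hand_position):
        self.counts[orientation][self._cell(hand_position)] += 1
        self.totals[orientation] += 1

    def estimate(self, hand_position):
        if not self.totals:
            return None
        cell = self._cell(hand_position)
        n = sum(self.totals.values())
        best, best_score = None, -1.0
        for orientation, total in self.totals.items():
            prior = total / n
            likelihood = (self.counts[orientation][cell] + 1) / (total + 1)  # Laplace smoothing
            score = prior * likelihood
            if score > best_score:
                best, best_score = orientation, score
        return best
```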

Determining UAV Flight Direction Control Method for Shooting the images of Multiple Users based on NUI/NUX (NUI/NUX 기반 복수의 사용자를 촬영하기 위한 UAV 비행방향 제어방법)

  • Kwak, Jeonghoon;Sung, Yunsick
    • Proceedings of the Korea Information Processing Society Conference / 2018.05a / pp.445-446 / 2018
  • Recently, cameras mounted on unmanned aerial vehicles (UAVs) have been used to provide video that captures users from new viewpoints beyond eye level. To track and film a user, Bluetooth Low Energy (BLE) signals, video, and Natural User Interface/Natural User Experience (NUI/NUX) technology are utilized. When a user is tracked with a BLE signal, however, tracking is limited to following a single user from behind, whereas a method for tracking and filming multiple users from the front is needed. This paper describes a method for determining the flight direction of a UAV so that it can track and film multiple users from the front: the UAV measures the BLE signals received from the multiple users, and the changes in these signals are used to determine the UAV's flight direction.
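
A hedged sketch of the underlying idea, steering the UAV from changes in the BLE signal strength measured for each user, is given below. The RSSI threshold and the simple averaging rule are invented for illustration and do not reproduce the decision method in the paper.

```python
# Sketch only: decide a coarse flight command from per-user RSSI changes (dBm).
# The threshold and the mean-delta rule are illustrative assumptions.
def decide_direction(prev_rssi: dict, curr_rssi: dict, threshold: float = 2.0) -> str:
    deltas = [curr_rssi[u] - prev_rssi[u] for u in curr_rssi if u in prev_rssi]
    if not deltas:
        return "hover"
    mean_delta = sum(deltas) / len(deltas)
    if mean_delta > threshold:    # signals getting stronger: users closing in
        return "move_backward"
    if mean_delta < -threshold:   # signals getting weaker: users moving away
        return "move_forward"
    return "hover"

# Example: both users' beacons weakened, so the UAV should move to follow them.
print(decide_direction({"u1": -60, "u2": -62}, {"u1": -66, "u2": -65}))
```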

NUI/NUX of the Virtual Monitor Concept using the Concentration Indicator and the User's Physical Features (사용자의 신체적 특징과 뇌파 집중 지수를 이용한 가상 모니터 개념의 NUI/NUX)

  • Jeon, Chang-hyun;Ahn, So-young;Shin, Dong-il;Shin, Dong-kyoo
    • Journal of Internet Computing and Services / v.16 no.6 / pp.11-21 / 2015
  • With growing interest in Human-Computer Interaction (HCI), research on HCI has been actively conducted, including research on Natural User Interface/Natural User eXperience (NUI/NUX) that uses a user's gestures and voice. NUI/NUX requires recognition algorithms such as gesture recognition or voice recognition, but these algorithms are complex to implement and require substantial training time because they must go through steps including preprocessing, normalization, and feature extraction. Recently, Microsoft released Kinect as an NUI/NUX development tool, which has attracted much attention, and studies using Kinect have been conducted. In a previous study, the authors implemented a highly intuitive hand-mouse interface using a user's physical features; however, it suffered from unnatural mouse movement and low accuracy of the mouse functions. In this study, a hand-mouse interface was designed and implemented that introduces a new concept called the 'virtual monitor', extracted from the user's physical features through Kinect in real time. The virtual monitor is a virtual space that can be controlled by the hand mouse, and coordinates on the virtual monitor can be accurately mapped onto coordinates on the real monitor. The hand-mouse interface based on the virtual monitor concept retains the intuitiveness of the previous study while improving the accuracy of the mouse functions. Accuracy was further increased by recognizing a user's unintended actions with a concentration indicator derived from electroencephalogram (EEG) data. To evaluate intuitiveness and accuracy, the interface was tested with 50 people ranging from their teens to their fifties. In the intuitiveness experiment, 84% of the subjects learned how to use it within one minute; in the accuracy experiment, the mouse functions achieved accuracies of 80.4% (drag), 80% (click), and 76.7% (double-click). Having verified its intuitiveness and accuracy experimentally, the proposed hand-mouse interface is expected to serve as a good example of an interface for controlling systems by hand.
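
The sketch below illustrates the two mechanisms this abstract combines: mapping a point on a virtual monitor plane onto real monitor pixels, and gating mouse actions with an EEG concentration index. The plane parameters, the threshold, and the function names are assumptions for the example, not the paper's implementation.

```python
# Minimal sketch of the "virtual monitor" idea: a rectangle placed in front of
# the user is mapped onto the real monitor, and hand actions are ignored when
# an EEG concentration index is below a threshold. All values are illustrative.
SCREEN_W, SCREEN_H = 1920, 1080

def virtual_to_screen(hand, origin, width, height):
    """Map a hand point on the virtual monitor plane to screen pixels."""
    u = (hand[0] - origin[0]) / width
    v = (hand[1] - origin[1]) / height
    u, v = min(max(u, 0.0), 1.0), min(max(v, 0.0), 1.0)
    return int(u * (SCREEN_W - 1)), int((1.0 - v) * (SCREEN_H - 1))

def allow_action(concentration_index: float, threshold: float = 0.6) -> bool:
    """Suppress mouse actions made while the user is not concentrating."""
    return concentration_index >= threshold

# Example: a hand at the centre of a 0.6 m x 0.4 m virtual monitor.
print(virtual_to_screen((0.3, 0.2), origin=(0.0, 0.0), width=0.6, height=0.4))
```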

Design of Multi-modal NUI/NUX using XML and KINECT (XML과 키넥트를 이용한 멀티모달 NUI/NUX 설계)

  • Lee, Gwang-Hyung;Shin, Dong Kyoo;Shin, Dong Il
    • Proceedings of the Korea Information Processing Society Conference / 2013.11a / pp.1693-1696 / 2013
  • Until now, the keyboard and mouse have served as the interface between people and computers. Recently, with the arrival of the ubiquitous era, smartphone use has grown and individual devices are being integrated into one. Accordingly, interfaces have evolved toward NUI, and multi-modal forms such as touch, motion tracking, voice, and facial expression recognition are being developed at the device level to provide higher recognition capability and more intuitive interaction. This paper proposes a design for marker-free, intuitive hand-gesture recognition using Kinect and an XML cloud-based integrated interface for various devices.
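
As a rough sketch of how an XML description might tie recognized inputs to device commands in such an integrated interface, the example below parses a small mapping file; the schema and the command names are hypothetical.

```python
# Hypothetical XML schema mapping recognized inputs (gesture, voice) to
# device commands; invented for illustration only.
import xml.etree.ElementTree as ET

CONFIG = """<nui>
  <mapping modality="gesture" input="swipe_left" command="tv.prev_channel"/>
  <mapping modality="voice"   input="volume up"  command="tv.volume_up"/>
</nui>"""

def load_mappings(xml_text: str) -> dict:
    root = ET.fromstring(xml_text)
    return {(m.get("modality"), m.get("input")): m.get("command")
            for m in root.findall("mapping")}

mappings = load_mappings(CONFIG)
print(mappings[("gesture", "swipe_left")])  # -> tv.prev_channel
```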

Motion Recognition Sensor-based UAV Control Signal Generation Method (모션 인식 센서 기반의 UAV 제어 신호 생성 방법)

  • Kim, Donguk;Kim, Hyeok;Sung, Yunsick
    • Proceedings of the Korea Information Processing Society Conference / 2016.04a / pp.682-685 / 2016
  • With the recent spread of unmanned aerial vehicles (UAVs), efforts have been made to control them in various ways, including attempts to apply NUI/NUX to UAV control. Traditional NUI/NUX relies heavily on visual feedback in virtual reality, but techniques for controlling virtual agents based on motion recognition sensors have also been introduced. This paper proposes a system that controls a UAV using the Myo motion recognition sensor. The proposed system is a subsystem of a Ground Control Station (GCS) and uses two Myo sensors because, as in traditional drone piloting, control is performed with both arms.
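
A minimal sketch of the kind of two-arm mapping the abstract describes is shown below: pitch and roll angles from two armbands are normalized into the four classic RC control channels. The channel assignment and angle range are assumptions, not the paper's mapping.

```python
# Sketch only: left arm -> throttle/yaw, right arm -> pitch/roll, with arm
# angles in degrees normalized to control values in [-1, 1].
def arms_to_channels(left_pitch: float, left_roll: float,
                     right_pitch: float, right_roll: float,
                     max_angle: float = 45.0) -> dict:
    clamp = lambda a: max(-1.0, min(1.0, a / max_angle))
    return {
        "throttle": clamp(left_pitch),
        "yaw":      clamp(left_roll),
        "pitch":    clamp(right_pitch),
        "roll":     clamp(right_roll),
    }

# Example: left arm raised 20 degrees, right arm rolled 30 degrees.
print(arms_to_channels(20.0, -5.0, 0.0, 30.0))
```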

Real-Time Recognition Method of Counting Fingers for Natural User Interface

  • Lee, Doyeob;Shin, Dongkyoo;Shin, Dongil
    • KSII Transactions on Internet and Information Systems (TIIS) / v.10 no.5 / pp.2363-2374 / 2016
  • Communication occurs through verbal elements, which usually involve language, as well as non-verbal elements such as facial expressions, eye contact, and gestures. Among these non-verbal elements, gestures in particular are symbolic representations of physical, vocal, and emotional behaviors: they can be signals toward a target or expressions of internal psychological processes, rather than simply movements of the body or hands. Gestures with such properties have therefore been the focus of much research on new interfaces in the NUI/NUX field. In this paper, we propose a method for detecting the hand region and recognizing the number of raised fingers, based on depth information and the geometric features of the hand, for application to an NUI/NUX. The hand region is detected using the depth information provided by the Kinect system, and the number of fingers is identified by comparing the distances between the contour and the center of the hand region. The contour is detected with the Suzuki85 algorithm, and fingertips are located at points of maximum distance, found by comparing the distances from three consecutive contour points to the center point of the hand. The average recognition rate for the number of fingers is 98.6%, and the execution time of the algorithm is 0.065 ms. Although the method is fast and its complexity is low, it shows a higher recognition rate and faster recognition speed than other methods. As an application example, the paper describes a Secret Door that recognizes a password from the number of fingers held up by the user.
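
The counting step lends itself to a compact sketch: OpenCV's findContours (which implements the Suzuki85 border-following algorithm) yields the hand contour, the hand centre comes from image moments, and fingertips are taken as contour points whose distance to the centre is a local maximum among consecutive points. The depth-based segmentation from the paper is assumed to have already produced the binary hand mask, and the distance ratio is an illustrative threshold.

```python
# Simplified sketch of the finger-counting idea described above.
import cv2
import numpy as np

def count_fingers(hand_mask: np.ndarray, min_dist_ratio: float = 0.6) -> int:
    # Border following (Suzuki85) gives the hand contour.
    contours, _ = cv2.findContours(hand_mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_NONE)
    if not contours:
        return 0
    contour = max(contours, key=cv2.contourArea).reshape(-1, 2)
    m = cv2.moments(contour)
    if m["m00"] == 0:
        return 0
    cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]
    dist = np.hypot(contour[:, 0] - cx, contour[:, 1] - cy)
    fingers = 0
    for i in range(len(dist)):
        prev_d, next_d = dist[i - 1], dist[(i + 1) % len(dist)]
        # A fingertip: local maximum of the centre distance, far enough out.
        if dist[i] > prev_d and dist[i] >= next_d and dist[i] > min_dist_ratio * dist.max():
            fingers += 1
    return fingers
```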

Multi-sensor based NUI/NUX framework for various interactive applications (다양한 상호작용 어플리케이션을 위한 이종 센서 NUI/NUX 프레임워크)

  • Zhang, Weiqiang;Xi, Yulong;Wen, Mingyun;Cho, Seoungjae;Chae, Jeongsook;Kim, Junoh;Um, Kyhyun;Cho, Kungeun
    • Proceedings of the Korea Information Processing Society Conference / 2017.04a / pp.1077-1078 / 2017
  • In this study, we implement a natural user interface/experience framework using multiple sensors: Microsoft Kinect, Leap Motion, and the Myo armband. The framework is designed to be used in various types of interactive applications. We integrate the functions of the three sensors into a single application and provide an interface that lets users interact with a computer easily. The framework tracks body information in real time and accurately recognizes the motion of different body parts.
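
A hedged sketch of how heterogeneous sensors could sit behind a single interface, as the framework above describes, follows. The class and event names are invented, and the actual Kinect, Leap Motion, and Myo SDK calls are omitted.

```python
# Sketch of a unifying sensor interface; concrete SDK bindings are omitted.
from abc import ABC, abstractmethod

class Sensor(ABC):
    @abstractmethod
    def poll(self) -> list[dict]:
        """Return normalized events, e.g. {'part': 'hand', 'pos': (x, y, z)}."""

class NuiFramework:
    def __init__(self, sensors: list[Sensor]):
        self.sensors = sensors
        self.handlers = []

    def on_event(self, handler) -> None:
        self.handlers.append(handler)

    def update(self) -> None:
        # Poll every sensor and dispatch its events to all registered handlers.
        for sensor in self.sensors:
            for event in sensor.poll():
                for handler in self.handlers:
                    handler(event)
```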

Path Generation Method of UAV Autopilots Using Max-Min Algorithm

  • Kwak, Jeonghoon;Sung, Yunsick
    • Journal of Information Processing Systems / v.14 no.6 / pp.1457-1463 / 2018
  • In recent times, Natural User Interface/Natural User Experience (NUI/NUX) technology has found widespread application across a diverse range of fields and is also used to control unmanned aerial vehicles (UAVs). Even with NUI/NUX technology, however, it is difficult for a user to control a UAV directly, so an autopilot is needed; the autopilot in turn requires a flight path, which the user sets as a sequence of waypoints. UAVs normally fly straight from one waypoint to the next, but a straight-line flight between two waypoints may collide with obstacles. To solve this collision problem, flight records can be used to adjust the generated path with the locations of the obstacles taken into account. This paper proposes a method for generating natural paths between waypoints based on flight records collected from UAVs flown by users. Bayesian probability is used to select the paths most similar to the flight records connecting two waypoints, and the final path is generated by selecting the center path with the highest Bayesian probability. While a K-means-based straight-line method generated paths that led to UAV collisions, the proposed method generates paths that allow UAVs to avoid obstacles.
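
The sketch below stands in for the path-selection step: recorded paths whose endpoints lie near the requested waypoints are treated as candidates, and the most frequently flown candidate is returned. This frequency-based choice is a simplification standing in for the Bayesian selection described in the paper.

```python
# Simplified stand-in for record-based path selection between two waypoints.
from collections import Counter
import math

def _close(a, b, tol: float = 5.0) -> bool:
    return math.dist(a, b) <= tol

def select_path(flight_records, start, goal):
    """flight_records: list of paths, each a tuple of (x, y) points."""
    candidates = [p for p in flight_records
                  if _close(p[0], start) and _close(p[-1], goal)]
    if not candidates:
        return (start, goal)            # fall back to a straight segment
    counts = Counter(candidates)
    return counts.most_common(1)[0][0]  # the path flown most often
```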

Design of Photographing System for Multiple Users based on UAV (UAV 기반 복수의 사용자를 촬영하기 위한 촬영 시스템 설계 연구)

  • Kwak, Jeonghoon;Sung, Yunsick
    • Proceedings of the Korea Information Processing Society Conference / 2018.05a / pp.479-480 / 2018
  • Recently, cameras attached to unmanned aerial vehicles (UAVs) have been used to film users and record video during leisure activities and travel. To film a user with a UAV-mounted camera, either the user pilots the UAV directly or NUI/NUX technology is used: a flight path is set in advance, or the UAV flies automatically while tracking a single user and filming with that user at the center. However, when recording video during leisure and travel, there are cases where filming must consider multiple users rather than centering on a single user, so a system is needed in which the UAV films multiple users while taking their positions into account. This paper designs a photographing system for filming multiple users; the system controls the UAV based on changes in the positions of the multiple users.
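
As a rough illustration of controlling the UAV from the users' positions, the sketch below tracks the centroid of the group and commands a velocity toward a standoff point in front of it. The gain and standoff distance are assumptions, not values from the paper.

```python
# Sketch only: steer toward a point offset from the centroid of the users.
def group_centroid(positions):
    xs, ys = zip(*positions)
    return sum(xs) / len(xs), sum(ys) / len(ys)

def velocity_command(uav_pos, user_positions, standoff=(0.0, -5.0), gain=0.5):
    cx, cy = group_centroid(user_positions)
    target = (cx + standoff[0], cy + standoff[1])   # hold 5 m in front of the group
    return (gain * (target[0] - uav_pos[0]),
            gain * (target[1] - uav_pos[1]))

# Example: two users, UAV currently 10 m behind them.
print(velocity_command((0.0, -10.0), [(1.0, 0.0), (-1.0, 2.0)]))
```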