• Title/Summary/Keyword: Hand user interface


Intelligent interface using hand gestures recognition based on artificial intelligence (인공지능 기반 손 체스처 인식 정보를 활용한 지능형 인터페이스)

  • Hangjun Cho;Junwoo Yoo;Eun Soo Kim;Young Jae Lee
    • Journal of Platform Technology / v.11 no.1 / pp.38-51 / 2023
  • We propose an intelligent interface algorithm that uses hand gesture recognition information based on artificial intelligence. The interface tracks and recognizes a variety of user hand gestures quickly by combining MediaPipe with artificial intelligence techniques such as KNN, LSTM, and CNN. To evaluate its performance, the algorithm was applied to a self-made 2D top-view racing game and to robot control. In the game, it enabled detailed and robust control of the virtual object's movements; applied to a robot in the real world, it controlled moving, stopping, turning left, and turning right. In addition, by controlling the game's main character and the real-world robot at the same time, the interface demonstrated optimized control of a coexistence space spanning the virtual and real worlds. Because it exploits natural, intuitive body movements and fine finger motions, the proposed algorithm enables sophisticated control, can be mastered in a short time, and can serve as basic data for developing intelligent user interfaces.

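The pipeline described above (MediaPipe hand tracking feeding classifiers such as KNN) can be illustrated with a minimal Python sketch. This is not the paper's implementation: the webcam source, the pre-collected landmark files, and the gesture labels are placeholder assumptions; only the MediaPipe and scikit-learn calls are real APIs.

```python
import cv2
import mediapipe as mp
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

# Hypothetical training data: flattened (x, y, z) coordinates of the 21
# MediaPipe hand landmarks per sample, with integer gesture labels.
X_train = np.load("gesture_features.npy")   # shape: (n_samples, 63)
y_train = np.load("gesture_labels.npy")     # shape: (n_samples,)

knn = KNeighborsClassifier(n_neighbors=5)
knn.fit(X_train, y_train)

hands = mp.solutions.hands.Hands(max_num_hands=1)
cap = cv2.VideoCapture(0)

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # MediaPipe expects RGB input; OpenCV delivers BGR frames.
    result = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if result.multi_hand_landmarks:
        lm = result.multi_hand_landmarks[0].landmark
        feature = np.array([[p.x, p.y, p.z] for p in lm]).flatten()
        gesture = knn.predict(feature.reshape(1, -1))[0]
        # The predicted label would drive the game object or robot command.
        print("gesture:", gesture)

cap.release()
hands.close()
```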

Design of Ball-based Mobile Haptic Interface (볼 기반의 모바일 햅틱 인터페이스 디자인)

  • Choi, Min-Woo;Kim, Joung-Hyun
    • Proceedings of the HCI Society of Korea Conference / 2009.02a / pp.122-128 / 2009
  • In this paper, we present the design and evaluation of a hand-held, ball-based haptic interface named "TouchBall." Using a trackball mechanism, the device provides flexibility in terms of directional degrees of freedom. It also has the advantage of transferring force feedback directly through frictional touch (with high sensitivity), and therefore requires only a relatively small amount of inertia. This leads to a compact hand-held design appropriate for mobile and 3D interactive applications. The device was evaluated for the detection thresholds of force-feedback direction and for the perceived amount of directional force. The refined directionality information should combine with other modalities with less sensory conflict, enriching the user experience for a given application.

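The abstract mentions measuring detection thresholds for force-feedback direction. As a loose illustration only (the paper's actual psychophysical protocol is not described here), the sketch below shows a generic 2-down/1-up adaptive staircase, a standard way of estimating such thresholds; the simulated observer and step size are made up.

```python
import random

def two_down_one_up_staircase(detects, start=1.0, step=0.1, reversals_needed=6):
    """Generic 2-down/1-up staircase: the stimulus level drops after two
    consecutive detections and rises after a miss; the threshold estimate
    is the mean level at the reversal points."""
    level, streak, direction = start, 0, 0
    reversal_levels = []
    while len(reversal_levels) < reversals_needed:
        if detects(level):              # one trial at the current force level
            streak += 1
            if streak == 2:
                streak = 0
                if direction == +1:
                    reversal_levels.append(level)
                direction = -1
                level = max(level - step, 0.0)
        else:
            streak = 0
            if direction == -1:
                reversal_levels.append(level)
            direction = +1
            level += step
    return sum(reversal_levels) / len(reversal_levels)

# Simulated observer with a true threshold of 0.35 (arbitrary force units).
print(two_down_one_up_staircase(lambda x: x > 0.35 + random.gauss(0, 0.05)))
```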

A Design and Implementation of Natural User Interface System Using Kinect (키넥트를 사용한 NUI 설계 및 구현)

  • Lee, Sae-Bom;Jung, Il-Hong
    • Journal of Digital Contents Society / v.15 no.4 / pp.473-480 / 2014
  • As the use of computers has become widespread, active research is in progress to create interfaces that are more convenient and natural than existing ones such as the keyboard and mouse. For this reason, there is increasing interest in Microsoft's motion-sensing module, Kinect, which supports hand-motion and speech recognition for natural communication. Kinect uses its built-in sensor to recognize the body's main joint movements and depth, and its built-in microphone provides simple speech recognition. In this paper, the goal is to use Kinect's depth data, skeleton tracking, and a labeling algorithm to extract the hand and recognize its movement, and to replace existing peripherals with a virtual mouse, a virtual keyboard, and speech recognition.
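
The Kinect approach above extracts the hand from depth data with a labeling algorithm and drives a virtual mouse. A rough sketch of that idea, assuming a depth frame is already available as a NumPy array of millimetre values: the depth band, the choice of the largest blob as the hand, and the use of pyautogui to stand in for the paper's virtual mouse are all illustrative assumptions.

```python
import numpy as np
from scipy import ndimage
import pyautogui  # moves the real cursor; stands in for the virtual mouse

def hand_centroid_from_depth(depth_mm, near=500, far=900):
    """Keep pixels in a near depth band (assumed to contain the raised hand),
    label connected regions, and return the centroid of the largest blob
    as normalized (x, y) image coordinates."""
    mask = (depth_mm > near) & (depth_mm < far)
    labels, n = ndimage.label(mask)
    if n == 0:
        return None
    sizes = ndimage.sum(mask, labels, range(1, n + 1))
    largest = int(np.argmax(sizes)) + 1
    cy, cx = ndimage.center_of_mass(mask, labels, largest)
    return cx / depth_mm.shape[1], cy / depth_mm.shape[0]

def move_cursor(depth_mm):
    pos = hand_centroid_from_depth(depth_mm)
    if pos is not None:
        w, h = pyautogui.size()
        pyautogui.moveTo(int(pos[0] * w), int(pos[1] * h))
```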

Multimodal Interface Based on Novel HMI UI/UX for In-Vehicle Infotainment System

  • Kim, Jinwoo;Ryu, Jae Hong;Han, Tae Man
    • ETRI Journal / v.37 no.4 / pp.793-803 / 2015
  • We propose a novel HMI UI/UX for an in-vehicle infotainment system. Our proposed HMI UI comprises multimodal interfaces that allow a driver to safely and intuitively manipulate an infotainment system while driving. Our analysis of a touchscreen interface-based HMI UI/UX reveals that a driver's use of such an interface while driving can cause the driver to be seriously distracted. Our proposed HMI UI/UX is a novel manipulation mechanism for a vehicle infotainment service. It consists of several interfaces that incorporate a variety of modalities, such as speech recognition, a manipulating device, and hand gesture recognition. In addition, we provide an HMI UI framework designed to be manipulated using a simple method based on four directions and one selection motion. Extensive quantitative and qualitative in-vehicle experiments demonstrate that the proposed HMI UI/UX is an efficient mechanism through which to manipulate an infotainment system while driving.
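
The HMI UI framework above is operated with four directions and one selection motion across several modalities. A toy sketch of such a normalization layer is shown below; the event names and modality tables are invented for illustration and are not taken from the paper.

```python
from enum import Enum, auto
from typing import Optional

class Command(Enum):
    UP = auto()
    DOWN = auto()
    LEFT = auto()
    RIGHT = auto()
    SELECT = auto()

# Each modality is normalized to the same five commands, so the UI framework
# only ever sees direction + select, regardless of the input source.
SPEECH_MAP = {"up": Command.UP, "down": Command.DOWN,
              "left": Command.LEFT, "right": Command.RIGHT, "ok": Command.SELECT}
GESTURE_MAP = {"swipe_up": Command.UP, "swipe_down": Command.DOWN,
               "swipe_left": Command.LEFT, "swipe_right": Command.RIGHT,
               "grab": Command.SELECT}

def dispatch(modality: str, event: str) -> Optional[Command]:
    table = SPEECH_MAP if modality == "speech" else GESTURE_MAP
    return table.get(event)

print(dispatch("gesture", "swipe_left"))  # Command.LEFT
```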

Development of Hand Shape Editor for Sign Language Motion (수화 동작을 위한 손 모양 편집 프로그램의 개발)

  • Oh, Young-Joon;Park, Kwang-Hyun;Bien, Zeung-Nam
    • Proceedings of the KIEE Conference / 2007.04a / pp.216-218 / 2007
  • Korean Sign Language (KSL) is the communication method of the Deaf in Korea, and hand shape is one of the important elements of sign language. In this paper, we develop a KSL hand shape editor that makes it simple to compose hand shapes and connect them to a database. Hand shapes are edited through a graphical user interface (GUI) in a 3D virtual reality environment. The resulting hand shape codes are linked to a sign word editor to synthesize sign motion and to reduce the total amount of KSL data.

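The editor above stores hand shape codes in a database and reuses them in a sign word editor. One possible shape for such a record is sketched below, assuming each hand shape is coded as a set of finger joint angles; the schema, field names, and file name are hypothetical.

```python
import json
import sqlite3
from dataclasses import dataclass, asdict

@dataclass
class HandShape:
    code: str                  # e.g. a KSL hand shape code
    joint_angles: list[float]  # finger joint angles in degrees (assumed encoding)

db = sqlite3.connect("ksl_handshapes.db")
db.execute("CREATE TABLE IF NOT EXISTS handshape (code TEXT PRIMARY KEY, data TEXT)")

def save(shape: HandShape) -> None:
    db.execute("INSERT OR REPLACE INTO handshape VALUES (?, ?)",
               (shape.code, json.dumps(asdict(shape))))
    db.commit()

def load(code: str) -> HandShape:
    row = db.execute("SELECT data FROM handshape WHERE code = ?", (code,)).fetchone()
    return HandShape(**json.loads(row[0]))

# In such a scheme, a sign word would reference a sequence of stored codes.
save(HandShape(code="HS-01", joint_angles=[10.0, 45.0, 80.0]))
print(load("HS-01"))
```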

A Gesture Interface based on Hologram and Haptics Environments for Interactive and Immersive Experiences (상호작용과 몰입 향상을 위한 홀로그램과 햅틱 환경 기반의 동작 인터페이스)

  • Pyun, Hae-Gul;An, Haeng-A;Yuk, Seongmin;Park, Jinho
    • Journal of Korea Game Society / v.15 no.1 / pp.27-34 / 2015
  • This paper proposes a user interface that enhances immersion and usability by combining a hologram and a haptic device with a common Leap Motion sensor. Although Leap Motion delivers the physical motion of the user's hand to control a virtual environment, by itself it only drives virtual hands on a screen and interacts with the environment in one direction. In our system, a hologram is coupled with Leap Motion to improve immersion by placing the real and virtual hands in the same location. Moreover, we provide a prototype of sensory interaction by designing a haptic device that conveys touch in the virtual environment to the user's hand.
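
The system above couples Leap Motion hand tracking with a haptic device that conveys touch in the virtual scene. The sketch below shows only the core mapping from a tracked palm position to a vibration strength; the virtual sphere, its dimensions, and the idea that strength scales with penetration depth are assumptions, and the Leap Motion SDK and haptic hardware calls are omitted.

```python
import numpy as np

# A virtual object represented as a sphere in hand-tracking space (millimetres).
OBJECT_CENTER = np.array([0.0, 200.0, 0.0])
OBJECT_RADIUS = 50.0

def haptic_feedback_for_palm(palm_position: np.ndarray) -> float:
    """Map the tracked palm position to a vibration strength in [0, 1]:
    the deeper the palm penetrates the virtual sphere, the stronger the cue."""
    penetration = OBJECT_RADIUS - np.linalg.norm(palm_position - OBJECT_CENTER)
    return float(np.clip(penetration / OBJECT_RADIUS, 0.0, 1.0))

# In the real system the palm position would come from the hand tracker each
# frame and the returned strength would be sent to the haptic device.
print(haptic_feedback_for_palm(np.array([0.0, 230.0, 0.0])))  # 0.4
```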

Design and Implementation of a Stereoscopic Image Control System based on User Hand Gesture Recognition (사용자 손 제스처 인식 기반 입체 영상 제어 시스템 설계 및 구현)

  • Song, Bok Deuk;Lee, Seung-Hwan;Choi, HongKyw;Kim, Sung-Hoon
    • Journal of the Korea Institute of Information and Communication Engineering / v.26 no.3 / pp.396-402 / 2022
  • User interactions are being developed in various forms, and interactions using human gestures in particular are being actively studied. Among them, hand gesture recognition based on a 3D hand model is used as a human interface in the field of realistic media. Interfaces based on hand gesture recognition help users access media more easily and conveniently, and they should allow images to be viewed through fast and accurate recognition without restrictions on the computing environment. This paper develops a fast and accurate hand gesture recognition algorithm using the open-source MediaPipe framework and the k-NN (k-nearest neighbor) machine learning algorithm. In addition, to minimize dependence on the computing environment, a stereoscopic image control system based on hand gesture recognition was designed and implemented as an Internet-accessible web service running in a Docker container virtual environment.
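
Since the system above classifies MediaPipe landmarks with k-NN behind a web service in a Docker container, the sketch below shows what such a service endpoint might look like, assuming Flask, a client that sends extracted landmarks as JSON, and pre-saved training arrays; the route, payload format, and file names are assumptions, not the paper's API.

```python
import numpy as np
from flask import Flask, jsonify, request
from sklearn.neighbors import KNeighborsClassifier

app = Flask(__name__)

# Hypothetical training data: one row of 63 values (21 landmarks x 3 coords)
# per sample, labeled with the image-control command it should trigger.
X = np.load("landmark_features.npy")
y = np.load("command_labels.npy")
model = KNeighborsClassifier(n_neighbors=5).fit(X, y)

@app.route("/gesture", methods=["POST"])
def classify_gesture():
    # The client sends the MediaPipe hand landmarks it extracted.
    landmarks = np.array(request.get_json()["landmarks"], dtype=float)
    command = model.predict(landmarks.reshape(1, -1))[0]
    return jsonify({"command": str(command)})

if __name__ == "__main__":
    # In a setup like the paper's, this would run inside a Docker container.
    app.run(host="0.0.0.0", port=8080)
```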

Intuitive Spatial Drawing System based on Hand Interface (손 인터페이스 기반 직관적인 공간 드로잉 시스템)

  • Ko, Ginam;Kim, Serim;Kim, YoungEun;Nam, SangHun
    • Journal of Digital Contents Society / v.18 no.8 / pp.1615-1620 / 2017
  • The development of Virtual Reality (VR) technologies has improved the performance of VR devices and made them affordable, giving many users easy access to VR. VR drawing applications are uncomplicated for users and already quite mature, being used for education, performances, and more. With controller-based spatial drawing interfaces, however, the drawing interaction is constrained by the controller. This study proposes a hand-interaction-based spatial drawing system in which a user who has never used a controller can draw intuitively: a Leap Motion sensor mounted on the front of the Head Mounted Display (HMD) traces the motion of the user's hand in front of the HMD to draw curved surfaces in the virtual environment.
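
The drawing system above traces the hand in front of the HMD to form curved surfaces. The sketch below shows one common way to turn a stream of sampled fingertip positions into a ribbon-like triangle strip; the fixed ribbon width, the "up" vector, and the sample stroke are illustrative assumptions rather than the paper's method.

```python
import numpy as np

def ribbon_from_stroke(points: np.ndarray, width: float = 0.02) -> np.ndarray:
    """Build triangle-strip vertices for a ribbon that follows a hand stroke.

    points: (N, 3) fingertip positions sampled each frame from the tracker.
    Returns (2N, 3) vertices; consecutive vertex pairs form the strip.
    """
    up = np.array([0.0, 1.0, 0.0])  # assume the brush always faces "up"
    vertices = []
    for i in range(len(points)):
        # Direction of travel at this sample (one-sided difference at the ends).
        nxt = points[min(i + 1, len(points) - 1)]
        prv = points[max(i - 1, 0)]
        tangent = nxt - prv
        tangent = tangent / (np.linalg.norm(tangent) + 1e-9)
        side = np.cross(tangent, up)
        side = side / (np.linalg.norm(side) + 1e-9)
        vertices.append(points[i] - side * width / 2)
        vertices.append(points[i] + side * width / 2)
    return np.asarray(vertices)

# Example: a short stroke sampled along an arc.
t = np.linspace(0, np.pi / 2, 20)
stroke = np.stack([np.cos(t), np.sin(t), np.zeros_like(t)], axis=1)
print(ribbon_from_stroke(stroke).shape)  # (40, 3)
```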

Interactive sound experience interface based on virtual concert hall (가상 콘서트홀 기반의 인터랙티브 음향 체험 인터페이스)

  • Cho, Hye-Seung;Kim, Hyoung-Gook
    • The Journal of the Acoustical Society of Korea / v.36 no.2 / pp.130-135 / 2017
  • In this paper, we propose an interface for an interactive sound experience in a virtual concert hall. The interface consists of two systems, called 'virtual acoustic position' and 'virtual active listening'. To realize them, we applied an artificial reverberation algorithm, multi-channel source separation, and head-related transfer functions. The interface was implemented in Unity and presents the virtual concert hall to the user through an Oculus Rift virtual reality headset. A Leap Motion sensor serves as the control device so that the user can operate the system with bare hands, and the resulting sound is delivered through headphones.
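
Head-related transfer functions are one of the components listed above. The sketch below shows the basic HRTF step of binaural rendering, convolving a mono source with a left/right head-related impulse response pair; the toy noise signal and made-up impulse responses stand in for real measured HRIRs.

```python
import numpy as np
from scipy.signal import fftconvolve

def binaural_render(mono: np.ndarray, hrir_left: np.ndarray,
                    hrir_right: np.ndarray) -> np.ndarray:
    """Convolve a mono source with the HRIR pair for one source direction,
    producing a 2-channel signal for headphone playback."""
    left = fftconvolve(mono, hrir_left)
    right = fftconvolve(mono, hrir_right)
    return np.stack([left, right], axis=1)

# Toy example: white noise rendered with made-up 128-tap impulse responses.
rng = np.random.default_rng(0)
source = rng.standard_normal(48000)
hrir_l = rng.standard_normal(128) * np.exp(-np.arange(128) / 16)
hrir_r = rng.standard_normal(128) * np.exp(-np.arange(128) / 16)
stereo = binaural_render(source, hrir_l, hrir_r)
print(stereo.shape)  # (48127, 2)
```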

A Study of Hand Gesture Recognition for Human Computer Interface (컴퓨터 인터페이스를 위한 Hand Gesture 인식에 관한 연구)

  • Chang, Ho-Jung;Baek, Han-Wook;Chung, Chin-Hyun
    • Proceedings of the KIEE Conference / 2000.07d / pp.3041-3043 / 2000
  • The GUI (graphical user interface) has been the dominant platform for HCI (human-computer interaction). The GUI-based style of interaction has made computers simpler and easier to use. However, GUIs do not easily support the natural, intuitive, and adaptive range of interaction that users need. In this paper we study an approach that tracks a hand through an image sequence and recognizes it in each video frame, so that the hand can replace the mouse as a pointing device for virtual reality. A real-time processing algorithm is proposed that estimates the position of the hand and segments it, taking into account the orientation of motion and the color distribution of the hand region.

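The approach above segments and tracks the hand using motion and the color distribution of the hand region. The sketch below covers only the color-segmentation half with OpenCV; the HSV skin range, the morphological clean-up, and the "largest blob is the hand" rule are illustrative assumptions, and the paper's motion cue is not reproduced.

```python
import cv2
import numpy as np

# Illustrative HSV skin-color band; real systems tune or learn this range.
SKIN_LO = np.array([0, 40, 60], dtype=np.uint8)
SKIN_HI = np.array([25, 180, 255], dtype=np.uint8)

cap = cv2.VideoCapture(0)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, SKIN_LO, SKIN_HI)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        hand = max(contours, key=cv2.contourArea)  # assume the largest skin blob is the hand
        m = cv2.moments(hand)
        if m["m00"] > 0:
            cx, cy = int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])
            # (cx, cy) would drive the on-screen pointer in place of the mouse.
            cv2.circle(frame, (cx, cy), 8, (0, 255, 0), -1)
    cv2.imshow("hand", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```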