• Title/Summary/Keyword: 손 동작 (hand gestures)

Visual Touchless User Interface for Window Manipulation (윈도우 제어를 위한 시각적 비접촉 사용자 인터페이스)

  • Kim, Jin-Woo;Jung, Kyung-Boo;Jeong, Seung-Do;Choi, Byung-Uk
    • Journal of KIISE:Software and Applications, v.36 no.6, pp.471-478, 2009
  • Recently, research on user interfaces has advanced remarkably due to the explosive growth of 3D content and applications and the widening range of computer users. This paper proposes a novel method to manipulate windows efficiently using only intuitive hand motions. Previous methods have drawbacks such as the burden of expensive devices, the high complexity of gesture recognition, and the need for additional information from markers. To address these drawbacks, we propose a novel visual touchless interface. First, to control windows with the hand, we detect the hand region using the hue channel of HSV color space. The distance transform is applied to locate the centroid of the hand, and the curvature of the hand contour is used to determine the positions of the fingertips. Finally, using the hand motion information, we recognize the hand gesture as one of seven predefined motions, and the recognized gesture serves as a command to control the window. Because the method adopts a stereo camera, the user can manipulate windows with a sense of depth as in the real environment. Intuitive manipulation is also possible because the method supports visual touch of the virtual object the user wants to manipulate using only simple hand motions. Finally, the efficiency of the proposed method is verified via an application built on the proposed interface.
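
The hue-channel segmentation and distance-transform centroid steps described above can be sketched as follows, assuming an OpenCV pipeline; the hue band and kernel sizes are illustrative assumptions, not values from the paper.

```python
import cv2
import numpy as np

def detect_hand_centroid(frame_bgr):
    """Segment a skin-colored region via the HSV hue channel and locate the palm centroid."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    hue = hsv[:, :, 0]
    # Rough skin-tone hue band (assumed); a real system would calibrate per user and lighting.
    mask = cv2.inRange(hue, 0, 20)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

    # The maximum of the distance transform lies near the palm center.
    dist = cv2.distanceTransform(mask, cv2.DIST_L2, 5)
    _, _, _, max_loc = cv2.minMaxLoc(dist)
    return mask, max_loc  # max_loc is the (x, y) palm centroid estimate
```

Fingertips would then be found by scanning the contour of `mask` for high-curvature points, as the abstract describes.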

Gesture Recognition based on Mixture-of-Experts for Wearable User Interface of Immersive Virtual Reality (몰입형 가상현실의 착용식 사용자 인터페이스를 위한 Mixture-of-Experts 기반 제스처 인식)

  • Yoon, Jong-Won;Min, Jun-Ki;Cho, Sung-Bae
    • Journal of the HCI Society of Korea, v.6 no.1, pp.1-8, 2011
  • As virtual reality has become a focus for providing immersive services, user interfaces for immersive interaction have been actively investigated. In this paper, we propose a gesture-recognition-based immersive user interface that uses an IR-LED-embedded helmet and data gloves to reflect the user's movements in the virtual reality environment effectively. The system recognizes the user's head movements with the IR-LED-embedded helmet and an IR signal transmitter, and hand gestures with data gathered from the data gloves. Hand gestures are difficult to recognize accurately with a single general recognition model because human hands have many articulations and users differ in hand size and movement. We therefore apply Mixture-of-Experts-based gesture recognition to handle the varied hand gestures of multiple users accurately. The movement of the user's head changes the perspective in the virtual environment to match the movement in the real world, and the user's hand gestures serve as inputs to the virtual environment. A head-mounted display (HMD) can be used with the proposed system to immerse the user in the virtual environment. To evaluate the usefulness of the proposed interface, we developed an interface for a virtual orchestra environment. The experiment verified that users can operate the system easily and intuitively while being entertained.
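
The Mixture-of-Experts idea referenced above can be illustrated with a minimal sketch: a gating network weights the outputs of several expert classifiers per input. The expert callables and gating parameters here are hypothetical placeholders, not the paper's trained models.

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def mixture_of_experts_predict(x, experts, gate_W, gate_b):
    """Combine per-expert class probabilities with input-dependent gating weights.

    `experts` is a list of callables returning class-probability vectors for x;
    `gate_W` and `gate_b` are assumed pre-trained gating-network parameters.
    """
    gate = softmax(gate_W @ x + gate_b)               # one weight per expert
    expert_probs = np.stack([e(x) for e in experts])  # shape: (n_experts, n_classes)
    return gate @ expert_probs                        # gated mixture over classes
```

Each expert can specialize in a subset of users or gesture styles, which is what makes the mixture robust to differing hand sizes and movements.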

A Study on Air Interface System (AIS) Using Infrared Ray (IR) Camera (적외선 카메라를 이용한 에어 인터페이스 시스템(AIS) 연구)

  • Kim, Hyo-Sung;Jung, Hyun-Ki;Kim, Byung-Gyu
    • The KIPS Transactions:PartB, v.18B no.3, pp.109-116, 2011
  • In this paper, we introduce a non-touch interface technology that requires no touch-based controlling mechanism, called the "Air-interface". To develop this system, we use the total reflection principle of infrared (IR) light, so that the user's hand is separated from the background in the image obtained at every frame. The segmented hand region in each frame is used as input to a hand-motion recognition module, which verifies the hand motion and executes the control event mapped to that motion. We describe the developed and proposed methods for image processing and hand-motion recognition. The developed air-touch technology can be useful for advertisement panels, entertainment presentation systems, kiosk systems, and many other applications.
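
Although the abstract does not give the exact segmentation procedure, the bright-hand-on-dark-background property of an IR rig can be sketched with simple thresholding; the threshold value and blur kernel below are assumptions.

```python
import cv2

def segment_ir_hand(ir_frame_gray, thresh=200):
    """Isolate the bright, IR-reflecting hand region from a dark background."""
    _, mask = cv2.threshold(ir_frame_gray, thresh, 255, cv2.THRESH_BINARY)
    mask = cv2.medianBlur(mask, 5)  # suppress speckle noise in the IR image
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return max(contours, key=cv2.contourArea) if contours else None  # largest blob = hand
```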

Face and Hand Tracking Algorithm for Sign Language Recognition (수화 인식을 위한 얼굴과 손 추적 알고리즘)

  • Park, Ho-Sik;Bae, Cheol-Soo
    • The Journal of Korean Institute of Communications and Information Sciences, v.31 no.11C, pp.1071-1076, 2006
  • In this paper, we develop face and hand tracking for a sign language recognition system. The system is divided into two stages: the initial stage and the tracking stage. In the initial stage, we use skin features to localize the face and hands of the signer. An ellipse model in CbCr space is constructed and used to detect skin color. After the skin regions have been segmented, face and hand blobs are identified using size and facial features, under the assumption that the face moves less than the hands in this signing scenario. In the tracking stage, motion estimation is applied only to the hand blobs; the first and second derivatives of position are used to predict the hand locations. We observed errors in the tracked position between consecutive frames in which the velocity changes abruptly. To improve tracking performance, the proposed algorithm compensates for this prediction error by using an adaptive search area to re-compute the hand blobs. The experimental results indicate that the proposed method decreases the prediction error by up to 96.87% with a negligible increase in computational complexity of up to 4%.
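
The first/second-derivative prediction and adaptive search area can be expressed as a small sketch; the constant-acceleration form and the radius-growth rule are illustrative readings of the abstract, not the paper's exact equations.

```python
import numpy as np

def predict_hand_position(p_prev2, p_prev1, p_curr):
    """Predict the next hand position from the last three tracked positions.

    The first derivative approximates velocity and the second derivative
    approximates acceleration, both in units of pixels per frame.
    """
    p_prev2, p_prev1, p_curr = map(np.asarray, (p_prev2, p_prev1, p_curr))
    velocity = p_curr - p_prev1
    acceleration = (p_curr - p_prev1) - (p_prev1 - p_prev2)
    return p_curr + velocity + 0.5 * acceleration

def adapt_search_radius(base_radius, last_error, gain=1.5):
    """Enlarge the search window after a large prediction error (assumed rule)."""
    return base_radius + gain * last_error
```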

Implementation of Real-time Recognition System for Continuous Korean Sign Language(KSL) mixed with Korean Manual Alphabet(KMA) (지문자를 포함한 연속된 한글 수화의 실시간 인식 시스템 구현)

  • Lee, Chan-Su;Kim, Jong-Sung;Park, Gyu-Tae;Jang, Won;Bien, Zeung-Nam
    • Journal of the Korean Institute of Telematics and Electronics C, v.35C no.6, pp.76-87, 1998
  • This paper deals with a system that continuously recognizes dynamic hand gestures, Korean Sign Language (KSL), mixed with static hand gestures, the Korean Manual Alphabet (KMA). Recognition of continuous hand gestures is very difficult because there are no explicit tokens indicating the beginning and end of signs and because each gesture is complex. In this paper, a state automaton is used to segment sequential signs into individual ones, and basic elements of KSL and KMA, consisting of 14 hand directions, 23 hand postures, and 14 hand orientations, are used for recognition of complex gestures with expandability in mind. Using a pair of CyberGloves and a Polhemus sensor, the system recognizes 131 Korean signs and 31 KMA letters in real time, with recognition rates of 94.3% for KSL (excluding rejection cases) and 96.7% for KMA.
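
The state-automaton segmentation step can be sketched as a toy state machine that separates moving strokes from static holds; the states and thresholds below are illustrative, not the paper's actual automaton.

```python
from enum import Enum

class SignState(Enum):
    IDLE = 0      # hands at rest between signs
    MOVING = 1    # a dynamic KSL sign is being performed
    HOLDING = 2   # a static posture, e.g. a KMA letter

def next_state(state, hand_speed, move_thresh=0.05):
    """Advance the segmentation automaton using sensor-derived hand speed."""
    if state == SignState.IDLE and hand_speed > move_thresh:
        return SignState.MOVING
    if state == SignState.MOVING and hand_speed <= move_thresh:
        return SignState.HOLDING          # candidate boundary between signs
    if state == SignState.HOLDING and hand_speed > move_thresh:
        return SignState.MOVING
    return state
```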

('Speaking by Using Hands') - Wearable PC for the Verbally Handicapped ('손으로 말해요' - 언어 장애인의 의사소통을 돕기 위한 웨어러블 PC)

  • Kim, Kyung-Hee;Kim, Kee-Hyung;Kim, Ha-Na;Park, Ji-Woo;Sun, Jung-Hee;Lee, Jae-Hyung;Jung, Jong-Phil
    • 한국HCI학회 학술대회논문집 (Proceedings of the HCI Society of Korea Conference), 2006.02a, pp.52-56, 2006
  • Some people have speech impairments from congenital causes, and others have difficulty communicating due to acquired conditions such as laryngeal cancer. 'Speaking by Using Hands' ('손으로 말해요'), proposed in this paper, is a device that helps such people express themselves. People who have difficulty speaking because of brain or vocal cord damage can speak using a glove-shaped input device and a speaker attached near the throat, communicating through the synthetic voice produced by this speaker. The first feature of this wearable PC for the verbally handicapped is a glove-shaped input device with a Bluetooth module that accepts input through simple finger motions. Excluding the thumbs, which are awkward to use, each of the remaining four fingers carries one switch, for a total of eight switches across both hands. The user presses a switch by bending a finger until it touches the palm, or by tapping the fingertip on a flat surface such as a desk. The key layout of the glove matches that of a PC keyboard, so users can adapt to it easily. Second, the device carries a speech synthesis module that produces a natural voice. The synthesized speech has natural intonation, and the speaker is attached near the throat so that the voice appears to come from a natural position. An HMD (Head Mounted Display) lets the user check whether the text is being entered correctly. All of the equipment except the gloves is carried in a bag for convenient wearing, and the gloves use the Bluetooth module to eliminate cables that would hinder use. Because 'Speaking by Using Hands' conveys what the user wants to say in a natural voice through simple finger motions, it can help people who experience inconvenience due to speech impairments.

8-Straight Line Directions Recognition Algorithm for Hand Gestures Using Coordinate Information (좌표 정보를 이용한 손동작 직선 8 방향 인식 알고리즘)

  • SODGEREL, BYAMBASUREN;Kim, Yong-Ki;Kim, Mi-Hye
    • Journal of Digital Convergence, v.13 no.9, pp.259-267, 2015
  • In this paper, we propose a method for determining straight lines and an algorithm for classifying a straight line into one of 8 directions using coordinate information and properties of trigonometric functions. In an experiment, each of the 8 hand gestures was carried out 100 times, for a total of 800 trials. The 8-direction determination algorithm achieved its highest accuracy of 92% for the diagonal direction toward the upper left, while the left, upper-right diagonal, and lower-right diagonal directions showed the lowest accuracy of 82%. Unlike existing recognizers that rely on image processing or a learning process, the proposed method enables hand gesture recognition using only coordinate information.
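
The 8-direction determination from coordinate information can be sketched with the angle of the displacement vector; the 45-degree sector boundaries below are a natural reading of the abstract, though the exact trigonometric rule in the paper may differ.

```python
import math

DIRECTIONS = ["right", "upper-right", "up", "upper-left",
              "left", "lower-left", "down", "lower-right"]

def classify_direction(start, end):
    """Map a stroke from `start` to `end` onto one of 8 straight-line directions."""
    dx = end[0] - start[0]
    dy = start[1] - end[1]                      # flip y: image coordinates grow downward
    angle = math.degrees(math.atan2(dy, dx)) % 360
    sector = int(((angle + 22.5) % 360) // 45)  # one 45-degree sector per direction
    return DIRECTIONS[sector]
```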

The Convergence Effect of Task-Oriented Training and Vibration Stimulation, Transcranial Direct Current Stimulation to Improve Upper Limb Function in Stroke (뇌졸중 환자의 상지기능 개선을 위한 과제 지향적 훈련과 진동 자극, 경두개 직류 전류 자극의 융합 효과)

  • Kim, Sun-Ho
    • Journal of the Korea Convergence Society, v.11 no.9, pp.31-37, 2020
  • The purpose of this study was to investigate the effect of transcranial direct current stimulation combined with task-oriented training and vibration stimulation on hand dexterity and upper limb function in stroke patients. Training was performed for 30 minutes per session, five times a week, for 4 weeks. Participants were divided into an experimental group receiving transcranial direct current stimulation combined with task-oriented training and vibration stimulation, and a control group receiving task-oriented training combined with vibration stimulation, with 10 members each. Hand dexterity and upper limb recovery were measured. Both the experimental and control groups showed significant improvement in hand dexterity, grasping, and gross movement (p<.05). The experimental group showed significantly greater improvement in hand dexterity, grasp, and grip than the control group. The effect size showed at least a small effect in all evaluation items. Based on these results, more effective and efficient rehabilitation treatment may be possible in the clinic.

Intelligent interface using hand gesture recognition based on artificial intelligence (인공지능 기반 손 제스처 인식 정보를 활용한 지능형 인터페이스)

  • Hangjun Cho;Junwoo Yoo;Eun Soo Kim;Young Jae Lee
    • Journal of Platform Technology, v.11 no.1, pp.38-51, 2023
  • We propose an intelligent interface algorithm that uses hand gesture recognition information based on artificial intelligence. The interface tracks and recognizes various user hand gestures quickly and intelligently by combining MediaPipe with artificial intelligence techniques such as KNN, LSTM, and CNN. To evaluate the performance of the proposed algorithm, we applied it to a self-made 2D top-view racing game and to robot control. In the game, the algorithm made it possible to control the various movements of the virtual object in detail and robustly. Applied to robot control in the real world, it enabled control of movement, stopping, left turns, and right turns. In addition, by controlling the main character of the game and the real-world robot at the same time, the optimized motions were implemented as an intelligent interface for controlling a space where the virtual and real worlds coexist. The proposed algorithm enables sophisticated control based on natural and intuitive movements of the body and fine movements of the fingers, and users can become proficient with it in a short period of time, so it can serve as a basis for developing intelligent user interfaces.
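
A minimal sketch of a MediaPipe-plus-KNN pipeline of the kind the abstract describes is shown below; the feature layout and `k` are assumptions, and the LSTM/CNN branches mentioned in the paper are omitted.

```python
import cv2
import mediapipe as mp
import numpy as np

mp_hands = mp.solutions.hands

def landmark_vector(frame_bgr, hands):
    """Return a flat (63,) vector of hand landmark coordinates, or None if no hand is found."""
    rgb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB)
    result = hands.process(rgb)
    if not result.multi_hand_landmarks:
        return None
    lm = result.multi_hand_landmarks[0].landmark
    return np.array([[p.x, p.y, p.z] for p in lm]).flatten()

def knn_classify(sample, train_X, train_y, k=5):
    """Plain k-nearest-neighbour vote over labeled landmark vectors (assumed training data)."""
    dists = np.linalg.norm(train_X - sample, axis=1)
    nearest = train_y[np.argsort(dists)[:k]]
    return np.bincount(nearest).argmax()

# Usage sketch:
# with mp_hands.Hands(max_num_hands=1) as hands:
#     vec = landmark_vector(frame, hands)
#     if vec is not None:
#         gesture_id = knn_classify(vec, train_X, train_y)
```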

A Real-time Hand Pose Recognition Method with Hidden Finger Prediction (은닉된 손가락 예측이 가능한 실시간 손 포즈 인식 방법)

  • Na, Min-Young;Choi, Jae-In;Kim, Tae-Young
    • Journal of Korea Game Society, v.12 no.5, pp.79-88, 2012
  • In this paper, we present a real-time hand pose recognition method that provides an intuitive user interface through hand poses and movements without a keyboard or mouse. For this, the areas of the right and left hands are segmented from the depth camera image and noise is removed. Then, the rotation angle and centroid point of each hand area are calculated. Subsequently, a circle is expanded at regular intervals from the centroid of the hand, and finger joint points and end points are detected from the midpoints where the circle crosses the hand boundary. Lastly, the hand information computed for the current frame is matched against the hand model of the previous frame, and the hand model is updated for the next frame. Exploiting temporal coherence between consecutive frames, this allows the method to predict fingers hidden in the current frame from the previous frame's hand model. In an experiment on various hand poses with hidden fingers using both hands, the accuracy exceeded 95% and the performance exceeded 32 fps. The proposed method can be used as a contactless input interface in presentation, advertisement, education, and game applications.
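
The circle-expansion step for finding finger joints and tips can be sketched as counting mask transitions along circles of growing radius around the palm centroid; the sampling density below is an illustrative choice.

```python
import numpy as np

def finger_crossings(hand_mask, center, radius, samples=360):
    """Return points where a circle of `radius` around `center` enters the hand mask.

    Run over increasing radii, these crossing points trace finger joints and tips.
    """
    cx, cy = center
    angles = np.linspace(0, 2 * np.pi, samples, endpoint=False)
    xs = np.clip((cx + radius * np.cos(angles)).astype(int), 0, hand_mask.shape[1] - 1)
    ys = np.clip((cy + radius * np.sin(angles)).astype(int), 0, hand_mask.shape[0] - 1)
    ring = hand_mask[ys, xs] > 0
    entering = np.logical_and(~ring, np.roll(ring, -1))   # off -> on transitions
    return [(int(xs[i]), int(ys[i])) for i in np.nonzero(entering)[0]]
```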