• Title/Summary/Keyword: Hand Gesture Recognition (손 제스처 인식)

Design and Evaluation of a Hand-held Device for Recognizing Mid-air Hand Gestures (공중 손동작 인식을 위한 핸드 헬드형 기기의 설계 및 평가)

  • Seo, Kyeongeun; Cho, Hyeonjoong
    • KIPS Transactions on Software and Data Engineering / v.4 no.2 / pp.91-96 / 2015
  • We propose AirPincher, a handheld pointing device that recognizes delicate mid-air hand gestures to control a remote display. AirPincher is designed to overcome the disadvantages of the two existing kinds of hand-gesture-aware techniques, glove-based and vision-based: glove-based techniques require the cumbersome step of putting on gloves every time, while the performance of vision-based techniques depends on the distance between the user and the remote display. AirPincher lets a user hold the device in one hand and generate several delicate finger gestures, which are captured by sensors embedded in the device at close range. These features allow AirPincher to avoid the aforementioned disadvantages of existing techniques. We experimentally determine an efficient size for the virtual input space and evaluate two types of pointing interfaces with AirPincher on a remote display. Our experiments suggest appropriate configurations for using the proposed device.

Recognition of Natural Hand Gesture by Using HMM (HMM을 이용한 자연스러운 손동작 인식)

  • Kim, A-Ram; Rhee, Sang-Yong
    • Journal of the Korean Institute of Intelligent Systems / v.22 no.5 / pp.639-645 / 2012
  • In this paper, we propose a method for commanding a mobile robot by recognizing human hand gestures. Earlier hand-based robot-control systems relied on a set of pre-arranged gestures, so the commands felt unnatural, and users were forced to learn the gesture vocabulary, which made the systems inconvenient. Much research has therefore sought ways to make machines recognize free-form hand movement. In this paper, we use a 3D camera to obtain color and depth data, from which the human hand is located and its movement recognized. An HMM is used to classify the observed movement, and the recognized command is transferred to the robot, which then moves in the intended direction.
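The recognition step described above, scoring an observed motion sequence against one HMM per gesture, can be sketched as follows. The two-state models and direction-quantized symbols in the usage below are illustrative toy values, not the paper's trained parameters.

```python
import numpy as np

def forward_log_likelihood(pi, A, B, obs):
    """Scaled forward algorithm: log P(obs | HMM) for a discrete-symbol HMM.
    pi: initial state probs, A: state-transition matrix, B: emission matrix."""
    alpha = pi * B[:, obs[0]]
    log_lik = np.log(alpha.sum())
    alpha /= alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]   # propagate states, weight by emission
        log_lik += np.log(alpha.sum())
        alpha /= alpha.sum()            # rescale to avoid numerical underflow
    return log_lik

def classify(models, obs):
    """Pick the gesture whose HMM assigns the observation sequence
    the highest likelihood. models[name] = (pi, A, B)."""
    return max(models, key=lambda g: forward_log_likelihood(*models[g], obs))
```

For example, with symbols quantized to 0 ("leftward frame") and 1 ("rightward frame"), a model whose emissions favor symbol 0 will win on a mostly-0 sequence.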

Detection Accuracy Improvement of Hand Region using Kinect (키넥트를 이용한 손 영역 검출의 정확도 개선)

  • Kim, Heeae; Lee, Chang Woo
    • Journal of the Korea Institute of Information and Communication Engineering / v.18 no.11 / pp.2727-2732 / 2014
  • Recently, object tracking and recognition using Microsoft's Kinect has been actively studied. In this environment, human hand detection and tracking is the most basic technique for human-computer interaction. This paper proposes a method for improving the accuracy of the detected hand region's boundary against cluttered backgrounds. To do this, we combine skin-color-based hand detection results with the depth image extracted from Kinect. Experimental results show that the proposed method detects the hand region more accurately than detection from the depth image alone. If applied to a sign-language or gesture recognition system, the method is expected to contribute substantially to accuracy improvement.
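The core idea, intersecting a skin-color mask with a depth constraint, can be sketched with NumPy. The YCbCr thresholds and depth band below are commonly cited illustrative values, not the paper's tuned parameters.

```python
import numpy as np

def skin_mask_ycbcr(ycbcr):
    """Skin-color candidates from Cb/Cr thresholds (assumed typical values)."""
    cb, cr = ycbcr[..., 1], ycbcr[..., 2]
    return (cb >= 77) & (cb <= 127) & (cr >= 133) & (cr <= 173)

def hand_mask(ycbcr, depth_mm, near=400, far=900):
    """Keep only skin-colored pixels that also fall inside a plausible
    hand-distance band in the Kinect depth image (millimeters)."""
    in_range = (depth_mm >= near) & (depth_mm <= far)
    return skin_mask_ycbcr(ycbcr) & in_range
```

Using color alone admits skin-like background pixels, and depth alone admits any object at hand distance; the logical AND suppresses both error sources.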

A Study on the VR Payment System using Hand Gesture Recognition (손 제스쳐 인식을 활용한 VR 결제 시스템 연구)

  • Kim, Kyoung Hwan; Lee, Won Hyung
    • Journal of the Korean Society for Computer Game / v.31 no.4 / pp.129-135 / 2018
  • Electronic signatures, QR codes, and bar codes are used in everyday payment systems, and research has begun on payment systems implemented in VR environments. This paper proposes a VR electronic-signature system that uses hand gesture recognition to bring an existing payment flow into a VR environment. In a VR system, the user cannot type on a keyboard or touch a mouse, so there are several possible ways to build a payment interface with a VR controller. Electronic signature via hand gesture recognition is one of them, and gesture recognizers can be classified into warping methods, statistical methods, and template-matching methods. In this paper, the payment system is built in VR using the $P algorithm, which belongs to the template-matching family. To create the VR environment, we implemented a PayPal payment flow in which an actual payment is made, using Unity3D and Vive equipment.
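A much-simplified sketch of the $P family's point-cloud matching follows; the real $P recognizer also resamples every stroke to a fixed point count and tries several start alignments, which this illustration omits, and the templates are made up.

```python
import math

def normalize(points):
    """Translate the centroid to the origin and scale to a unit box,
    so matching is position- and size-invariant."""
    cx = sum(p[0] for p in points) / len(points)
    cy = sum(p[1] for p in points) / len(points)
    pts = [(x - cx, y - cy) for x, y in points]
    size = max(max(abs(x) for x, _ in pts), max(abs(y) for _, y in pts)) or 1.0
    return [(x / size, y / size) for x, y in pts]

def greedy_cloud_distance(a, b):
    """$P-style greedy matching: pair each point of a with the nearest
    still-unmatched point of b and sum the distances."""
    unmatched = list(b)
    total = 0.0
    for x, y in a:
        j = min(range(len(unmatched)),
                key=lambda k: (unmatched[k][0] - x) ** 2 + (unmatched[k][1] - y) ** 2)
        bx, by = unmatched.pop(j)
        total += math.hypot(bx - x, by - y)
    return total

def classify(points, templates):
    """Return the template name with the smallest cloud distance."""
    pts = normalize(points)
    return min(templates,
               key=lambda name: greedy_cloud_distance(pts, normalize(templates[name])))
```

Because $P treats a gesture as an unordered point cloud, the same signature drawn with a different stroke order still matches, which suits free-form signing in VR.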

8-Straight Line Directions Recognition Algorithm for Hand Gestures Using Coordinate Information (좌표 정보를 이용한 손동작 직선 8 방향 인식 알고리즘)

  • Sodgerel, Byambasuren; Kim, Yong-Ki; Kim, Mi-Hye
    • Journal of Digital Convergence / v.13 no.9 / pp.259-267 / 2015
  • In this paper, we propose a method for determining whether a stroke is a straight line and an algorithm for classifying a straight line into eight directions, using coordinate information and properties of trigonometric functions. In our experiment, each of the 8 hand gestures was performed 100 times, for a total of 800 trials. The 8-direction determination algorithm was most accurate for the upper-left diagonal direction (92%) and least accurate for the left, upper-right diagonal, and lower-right diagonal directions (82%). Because the method uses only coordinate information obtained through image processing, hand gestures can be recognized without a conventional recognizer or a separate learning process.
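The 8-direction decision from endpoint coordinates can be sketched with `atan2`; the sector layout here (45° sectors centered on each direction, screen coordinates with y growing downward) is an assumption, not necessarily the paper's exact rule.

```python
import math

DIRECTIONS = ["right", "upper-right", "up", "upper-left",
              "left", "lower-left", "down", "lower-right"]

def direction_8(x0, y0, x1, y1):
    """Classify the stroke (x0,y0)->(x1,y1) into one of 8 directions.
    Screen coordinates are assumed (y grows downward), hence the -dy."""
    angle = math.degrees(math.atan2(-(y1 - y0), x1 - x0)) % 360
    return DIRECTIONS[int(((angle + 22.5) % 360) // 45)]
```

Shifting the angle by half a sector (22.5°) before integer division places each named direction at the center of its 45° sector.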

Finger Recognition and Virtual Touch Service using AI (AI를 활용한 손가락 인식 및 가상 터치 서비스)

  • A-Ra Cho; Seung-Bae Yoo; Byeong-Hun Yun; Hyung-Ju Cho; Gwang-Rim Ha
    • Proceedings of the Korea Information Processing Society Conference / 2023.11a / pp.938-939 / 2023
  • COVID-19 has made the importance of contactless services even more prominent. To replace conventional input devices such as the keyboard and mouse, people can now give natural, simple input to digital devices with their hands. This paper presents a method for learning hand gestures using MediaPipe and LSTM (Long Short-Term Memory) deep learning and implementing them as a contactless input device. The technique has strong potential in virtual reality (VR), augmented reality (AR), the metaverse, kiosks, and similar settings.
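A common preprocessing step when feeding MediaPipe hand landmarks to an LSTM is padding or truncating each per-frame landmark sequence to a fixed window. A minimal sketch, where the window length of 30 frames and the 63-value feature size (21 landmarks × x/y/z) are illustrative assumptions:

```python
import numpy as np

def to_lstm_batch(sequences, seq_len=30, n_features=63):
    """Stack variable-length landmark sequences into a fixed
    (batch, seq_len, n_features) array suitable for an LSTM."""
    batch = np.zeros((len(sequences), seq_len, n_features), dtype=np.float32)
    for i, seq in enumerate(sequences):
        seq = np.asarray(seq, dtype=np.float32)[-seq_len:]  # keep most recent frames
        batch[i, -len(seq):] = seq                          # left-pad with zeros
    return batch
```

Left-padding keeps the most recent frames adjacent to the LSTM's final hidden state, which is what a gesture classifier typically reads.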

Skin segmentation and hand tracking for gesture recognition (제스처 인식을 위한 피부영역 분할기법 및 추적)

  • Chae, Seung-Ho; Seo, Jong-Hoon; Han, Tack-Don
    • Proceedings of the Korea Multimedia Society Conference / 2012.05a / pp.371-373 / 2012
  • This paper proposes a skin-region detection technique for color images that is robust to the background, along with an application based on hand recognition. A codebook model [1] is used to separate background from foreground, and the region of interest is derived from the foreground using skin-color information. In the skin-detection stage, the YCbCr, HSV, and LUV color models are combined, and robust skin regions are segmented by thresholding the candidate skin-color intervals. The segmented region is set as the region of interest and tracked with a Kalman filter. As a result, skin-region segmentation and tracking that is robust to illumination is possible against complex, fixed backgrounds, and the technique can serve as the basis of a user interface.
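The Kalman-filter tracking of the segmented region can be sketched as a constant-velocity filter on the region's centroid. The noise parameters below are toy values, not the paper's tuned settings.

```python
import numpy as np

class CentroidKalman:
    """Constant-velocity Kalman filter for a 2D region centroid."""

    def __init__(self, x, y, dt=1.0, q=1e-2, r=1.0):
        self.x = np.array([x, y, 0.0, 0.0])          # state: [x, y, vx, vy]
        self.P = np.eye(4)                            # state covariance
        self.F = np.array([[1, 0, dt, 0],             # constant-velocity motion
                           [0, 1, 0, dt],
                           [0, 0, 1, 0],
                           [0, 0, 0, 1]], dtype=float)
        self.H = np.array([[1, 0, 0, 0],              # we observe position only
                           [0, 1, 0, 0]], dtype=float)
        self.Q = q * np.eye(4)                        # process noise (assumed)
        self.R = r * np.eye(2)                        # measurement noise (assumed)

    def step(self, zx, zy):
        """Predict, then correct with the measured centroid; returns (x, y)."""
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        z = np.array([zx, zy])
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ (z - self.H @ self.x)
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[:2]
```

Smoothing the centroid this way bridges frames where the skin mask momentarily breaks up under lighting changes.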

Sign Language Translation Wearable Device Using Motion Recognition (모션 인식을 이용한 수화 번역 웨어러블 기기)

  • Jun-yeong Lee; Hyeon-su Kang; Sung-jun Kim; Jun-ho Son; Dong-jun Yoo; Yang-woo Park
    • Proceedings of the Korean Society of Computer Information Conference / 2023.07a / pp.453-454 / 2023
  • People with congenital hearing loss or speech impairments currently face great inconvenience in conversing with others. They find it difficult to use stores, and because their verbal communication ability is limited, even simple everyday exchanges are burdensome. At present, sign language must be translated at designated places using devices with built-in displays. To address this problem, this study developed a system that applies deep learning to recognize sign language, translate it, and output the text on a display. It was implemented on a Raspberry Pi using the MediaPipe AI framework and an SVM algorithm. The developed system provides a translation result for each gesture. By enabling translation in any place where conversation is needed, rather than only at designated locations, it is expected to reduce communication difficulties for people with hearing or speech impairments.

RealBook: A Tangible Electronic Book Based on the Interface of TouchFace-V (RealBook: TouchFace-V 인터페이스 기반 실감형 전자책)

  • Song, Dae-Hyeon; Bae, Ki-Tae; Lee, Chil-Woo
    • The Journal of the Korea Contents Association / v.13 no.12 / pp.551-559 / 2013
  • In this paper, we propose RealBook, a tangible e-book based on the TouchFace-V interface, which can recognize multi-touch and hand gestures. TouchFace-V applies projection technology to a flat surface such as a table, without spatial constraints. The system's configuration addresses the installation, calibration, and portability issues common to existing front-projected vision-based tabletop displays. Using computer-vision tracking, it supports hand touch and gesture input without sensors or traditional input devices. RealBook combines the analog feel of printed text with the multimedia effects of an e-book, and it provides digitally created stories whose experience and setting change with the choices the user makes on the book's interface. We thus propose a new concept of electronic book, named RealBook, distinct from existing e-books, together with the TouchFace-V interface, which provide more direct viewing and natural, intuitive interaction through hand touch and gesture.

An Implementation of Real-Time Numeral Recognizer Based on Hand Gesture Using Both Gradient and Positional Information (기울기와 위치 정보를 이용한 손동작기반 실시간 숫자 인식기 구현)

  • Kim, Ji-Ho; Park, Yang-Woo; Han, Kyu-Phil
    • KIPS Transactions on Software and Data Engineering / v.2 no.3 / pp.199-204 / 2013
  • This paper presents an implementation method for a real-time, gesture-based numeral recognizer for various information devices. The proposed algorithm continuously captures hand motion in open 3D space with the Kinect sensor. The captured motion is simplified with PCA in order to preserve trace consistency and minimize trace variations caused by noise and size changes. In addition, we propose a new HMM that uses both the gradient and the positional features of the simplified hand stroke. As a result, the algorithm is robust to variations in the size and speed of hand motion, and the combined model raises the recognition rate by up to 30%. Experimental results show that the proposed algorithm achieves a high recognition rate of about 98%.
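The PCA simplification of the captured trace can be sketched as projecting the 2D stroke onto its first principal axis via SVD; this illustrates the idea of suppressing jitter orthogonal to the stroke's dominant direction, not the paper's exact pipeline.

```python
import numpy as np

def principal_axis_projection(trace):
    """Project a 2D hand trace onto its first principal axis (PCA via SVD),
    yielding a 1D profile that discards small jitter across the stroke."""
    pts = np.asarray(trace, dtype=float)
    centered = pts - pts.mean(axis=0)            # remove the trace centroid
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return centered @ vt[0]                      # coordinates along the dominant axis
```

For a nearly horizontal stroke, the projection recovers the along-stroke progression while the vertical wobble contributes almost nothing.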