• Title/Summary/Keyword: 손 (hand)

Search results: 13,095

Robust hand segmentation on hand over face occlusion (손과 얼굴의 겹침 현상을 고려한 강인한 손 추출 알고리즘)

  • Kim, Ha-Young;Seo, Jon-Hoon;Han, Tack-Don
    • Proceedings of the Korean Information Science Society Conference / 2012.06c / pp.397-399 / 2012
  • This study proposes a method for effectively extracting the hand when it overlaps the face, whose pixel values differ only slightly from those of the hand. The proposed algorithm separates the hand and face regions, which lie at different depths, by finding connected components in the depth image. Instead of relying on existing complex methods, the connected component labeling technique normally applied to binary images is applied to the gray-scale depth image to separate regions of similar depth. As a result, robust hand extraction is obtained even when the hand overlaps the face, whose skin color is similar to that of the hand, and a more natural gesture recognition system can be built.
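
A minimal sketch of the depth-based separation idea above, assuming an 8-bit single-channel depth image in which the hand is the object closest to the camera and invalid pixels are 0; the tolerance-based flood fill stands in for connected-component labeling over similar depth values (OpenCV/NumPy assumed, not the paper's implementation):

```python
import cv2
import numpy as np

def extract_hand_by_depth(depth: np.ndarray, depth_tol: int = 10) -> np.ndarray:
    """Grow a connected region of similar depth from the point closest to the
    camera. `depth` is assumed to be a single-channel 8-bit depth image where
    the hand is the nearest object and invalid pixels are 0."""
    valid = depth > 0
    if not valid.any():
        return np.zeros_like(depth)

    # Seed at the closest valid pixel (smallest depth value).
    masked = np.where(valid, depth, 255).astype(np.uint8)
    seed_y, seed_x = np.unravel_index(np.argmin(masked), masked.shape)

    # Flood fill on the gray (depth) image: a pixel joins the region only if
    # its depth differs from its neighbours by at most depth_tol, i.e.
    # connected-component labeling over similar depth values.
    mask = np.zeros((depth.shape[0] + 2, depth.shape[1] + 2), np.uint8)
    cv2.floodFill(masked, mask, (int(seed_x), int(seed_y)), 255,
                  loDiff=depth_tol, upDiff=depth_tol,
                  flags=4 | cv2.FLOODFILL_MASK_ONLY | (255 << 8))
    return mask[1:-1, 1:-1]  # 255 inside the extracted hand region
```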

Improved Hand Region Tracking Using Face Region Tracking (얼굴 영역 추적을 통한 향상된 손 영역 추척에 관한 연구)

  • Son, Jisoo;Kim, Dongkyu;Lee, Seung Ho;Ro, Yong Man
    • Proceedings of the Korea Information Processing Society Conference / 2015.04a / pp.884-887 / 2015
  • Skin color is one of the most useful cues for hand region tracking. However, when the hand and face regions overlap or are close to each other, the bounding box produced by hand tracking tends to expand unnecessarily into the face region. In this paper, the face tracking result is used for hand region tracking; specifically, the hand bounding box is prevented from intruding into the face region. Experimental results show that, compared with not using the face tracking result, the proposed method predicts the hand bounding box more accurately while maintaining a fast processing speed of 30 to 35 frames per second.
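
The bounding-box constraint described above can be illustrated with a small geometric helper; this is a sketch under an assumed (x, y, w, h) box convention, not the paper's implementation:

```python
def clip_hand_bbox(hand, face):
    """Trim the hand bounding box (x, y, w, h) so it does not intrude into the
    face bounding box, keeping whichever one-sided trim preserves the most area."""
    hx1, hy1, hx2, hy2 = hand[0], hand[1], hand[0] + hand[2], hand[1] + hand[3]
    fx1, fy1, fx2, fy2 = face[0], face[1], face[0] + face[2], face[1] + face[3]

    # No overlap: keep the hand box unchanged.
    if hx2 <= fx1 or fx2 <= hx1 or hy2 <= fy1 or fy2 <= hy1:
        return hand

    # Candidate trims: cut the hand box back to each side of the face box.
    candidates = [
        (hx1, hy1, min(hx2, fx1), hy2),  # keep the part left of the face
        (max(hx1, fx2), hy1, hx2, hy2),  # keep the part right of the face
        (hx1, hy1, hx2, min(hy2, fy1)),  # keep the part above the face
        (hx1, max(hy1, fy2), hx2, hy2),  # keep the part below the face
    ]
    x1, y1, x2, y2 = max(candidates,
                         key=lambda b: max(0, b[2] - b[0]) * max(0, b[3] - b[1]))
    return (x1, y1, max(0, x2 - x1), max(0, y2 - y1))

# Example: a hand box drifting into a face box on its right is trimmed back.
print(clip_hand_bbox((100, 80, 60, 60), (140, 40, 80, 90)))  # -> (100, 80, 40, 60)
```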

Robot Control using Vision based Hand Gesture Recognition (비전기반 손 제스처 인식을 통한 로봇 컨트롤)

  • Kim, Dae-Soo;Kang, Hang-Bong
    • Proceedings of the Korea Information Processing Society Conference / 2007.11a / pp.197-200 / 2007
  • This paper proposes a vision-based hand gesture recognition method that recognizes several hand gestures from input images for a robot control system. Images received from the robot vary widely depending on the robot's position, the surrounding environment, lighting, and other factors. The system recognizes a set of gestures predefined for robot control from images captured in such diverse environments. First, Retinex image normalization is applied to make hand gesture recognition robust to illumination changes; the hand region is then detected in the YCrCb color space and its position is estimated. Feature vectors extracted from the detected hand region allow the required gestures to be recognized regardless of the size or rotation angle of the hand in the input image. The performance of the proposed gesture recognition was measured by comparison with existing gesture recognition methods for robot control.
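
A rough sketch of the illumination-normalization and YCrCb skin-detection steps described above; the single-scale Retinex, the Cr/Cb thresholds, and the centroid-based position estimate are assumed stand-ins rather than the paper's exact parameters (OpenCV/NumPy):

```python
import cv2
import numpy as np

def detect_hand_region(bgr: np.ndarray):
    """Illumination-normalized skin detection in YCrCb followed by a simple
    position estimate (centroid of the largest skin-colored blob)."""
    # Single-scale Retinex: log(image) - log(Gaussian-blurred image).
    img = bgr.astype(np.float32) + 1.0
    blur = cv2.GaussianBlur(img, (0, 0), sigmaX=30)
    retinex = np.log(img) - np.log(blur)
    norm = cv2.normalize(retinex, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)

    # Skin segmentation in YCrCb space (thresholds on Cr and Cb only).
    ycrcb = cv2.cvtColor(norm, cv2.COLOR_BGR2YCrCb)
    mask = cv2.inRange(ycrcb, (0, 133, 77), (255, 173, 127))

    # Hand position estimate: centroid of the largest skin-colored component.
    num, labels, stats, centroids = cv2.connectedComponentsWithStats(mask)
    if num < 2:
        return mask, None
    largest = 1 + int(np.argmax(stats[1:, cv2.CC_STAT_AREA]))
    return mask, tuple(centroids[largest])
```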

A Method of Hand Recognition for Virtual Hand Control of Virtual Reality Game Environment (가상 현실 게임 환경에서의 가상 손 제어를 위한 사용자 손 인식 방법)

  • Kim, Boo-Nyon;Kim, Jong-Ho;Kim, Tae-Young
    • Journal of Korea Game Society / v.10 no.2 / pp.49-56 / 2010
  • In this paper, we propose a method of controlling a virtual hand by recognizing the user's hand in a virtual reality game environment. The virtual hand is displayed on the game screen after the user's hand movement and direction are obtained from images captured by a camera, so the movement of the user's hand can serve as an input interface for the virtual hand to select and move objects. As a vision-based hand recognition method, the proposed approach converts the input image from RGB to HSV color space and segments the hand area using double thresholds on the H and S values followed by connected component analysis. Next, the center of gravity of the hand area is computed from the zeroth and first moments of the segmented region. Since the center of gravity lies near the center of the hand, the pixels of the segmented image farthest from it are taken as fingertips. Finally, the hand axis is obtained as the vector from the center of gravity to the fingertips. A method using a history buffer and a bounding box is also presented to increase recognition stability and performance. Experiments on various input images show that the proposed hand recognition method provides high accuracy and relatively fast, stable results.
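
The center-of-gravity, fingertip, and hand-axis computation described above can be sketched as follows, assuming the HSV double-threshold segmentation has already produced a binary hand mask (OpenCV/NumPy; the history buffer and bounding box are omitted):

```python
import cv2
import numpy as np

def hand_axis_from_mask(mask: np.ndarray):
    """Compute the center of gravity from the zeroth and first moments of a
    binary hand mask, take the mask pixel farthest from it as a fingertip,
    and return the hand axis as the unit vector from the center to that tip."""
    m = cv2.moments(mask, binaryImage=True)
    if m["m00"] == 0:
        return None
    cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]  # center of gravity

    ys, xs = np.nonzero(mask)
    d2 = (xs - cx) ** 2 + (ys - cy) ** 2
    tip = (int(xs[np.argmax(d2)]), int(ys[np.argmax(d2)]))  # farthest pixel

    axis = np.array([tip[0] - cx, tip[1] - cy])
    axis = axis / (np.linalg.norm(axis) + 1e-9)  # unit vector along the hand
    return (cx, cy), tip, axis
```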

A Study on the Hand-washing Awareness and Practices of Female University Students (여자 대학생의 손 씻기 의식과 실천에 관한 연구)

  • Kim, Jong-Gyu;Kim, Joong-Soon
    • Journal of Food Hygiene and Safety / v.24 no.2 / pp.128-135 / 2009
  • Hand-washing is one of the most important factors in infection control and in preventing cross-contamination. The objective of this study was to investigate female university students' awareness of hand-washing, their hand-washing practices, and the difference between their awareness and practices. A self-administered questionnaire survey and direct observation in restrooms were carried out separately on a university campus over a four-week period. A total of 97.4% of the survey respondents claimed to wash their hands after using the toilet, and 98.2% of the observed students actually did so according to the unnoticed observational study. However, only 6.3% of the students who washed their hands in the direct observation washed for more than 10 seconds, although 46.4% of survey respondents reported that they usually wash their hands for more than 10 seconds. Among the observed students who washed their hands, only 0.9% used soap, and 0.9% washed four parts of their hands. Paper towels were the most common hand-drying method in both the direct observation and the survey. Significant differences were found in duration, use of soap, parts washed, and hand-drying method between the questionnaire survey and the direct observation (p < 0.05). This study indicates that there is a noticeable difference between female university students' awareness of hand-washing and their hand-washing practices. Further research should examine the hand-washing practices of female university students in restrooms outside the university campus.

A Study on the Hand Hygiene of Food Handlers of Food Court and Cafeteria in University Campus (대학 구내 휴게음식점 종사자의 손 위생관리에 관한 연구)

  • Kim, Jong-Gyu;Park, Jeong-Yeong;Kim, Joong-Soon
    • Journal of Food Hygiene and Safety / v.25 no.2 / pp.133-142 / 2010
  • This study was performed to investigate the hand-washing awareness, hand-washing behavior, and levels of indicator microorganisms on the hands of food handlers working in the food court and cafeteria of a university campus. The three methods used were a questionnaire survey by interview, direct observation in restrooms, and microbiological examination according to the Food Code of Korea. The responding food handlers reported a positive attitude toward hand-washing compliance; however, the unnoticed direct observation revealed improper hand washing and poor hand hygiene. Significant differences were found between the questionnaire survey and the direct observation (p < 0.05) in hand-washing compliance after using the toilet, duration of hand washing, use of a hand-washing agent, washing of different parts of the hands, hand-drying method, water temperature, and method of turning off the water. Samples taken from the food handlers' hands before work showed higher levels of standard plate count, total and fecal coliforms, and Escherichia coli than samples taken after washing with water. After washing hands with antiseptic liquid soap, the bacterial populations on hands, including Staphylococcus aureus, were dramatically reduced. This study indicates that there is a remarkable difference between the food handlers' awareness of hand washing and their hand-washing behavior. Poor hand-washing compliance and hand hygiene were indicated by positive results for total and fecal coliforms, E. coli, and S. aureus on the hands of some food handlers. The findings suggest that the hand hygiene of the food handlers needs to be improved and that more training and education on hand washing and hand hygiene is necessary.

A Study on Vision-based Robust Hand-Posture Recognition Using Reinforcement Learning (강화 학습을 이용한 비전 기반의 강인한 손 모양 인식에 대한 연구)

  • Jang Hyo-Young;Bien Zeung-Nam
    • Journal of the Institute of Electronics Engineers of Korea CI / v.43 no.3 s.309 / pp.39-49 / 2006
  • This paper proposes a hand-posture recognition method using reinforcement learning to improve the performance of vision-based hand-posture recognition. The difficulties in vision-based hand-posture recognition lie in viewing-direction dependency and self-occlusion caused by the high degree of freedom of the human hand. General approaches to these problems include using multiple cameras and limiting the relative angle between the cameras and the user's hand. When multiple cameras are used, however, fusion techniques for reaching a final decision must be considered, and limiting the angle of the user's hand restricts the user's freedom. The proposed method combines angular features and appearance features to describe hand postures using a two-layered data structure and reinforcement learning. The validity of the proposed method is evaluated by applying it to a hand-posture recognition system using three cameras.

RGB Camera-based Real-time 21 DoF Hand Pose Tracking (RGB 카메라 기반 실시간 21 DoF 손 추적)

  • Choi, Junyeong;Park, Jong-Il
    • Journal of Broadcast Engineering / v.19 no.6 / pp.942-956 / 2014
  • This paper proposes a real-time hand pose tracking method using a monocular RGB camera. Hand tracking is highly ambiguous because a hand has many degrees of freedom. To reduce this ambiguity, the proposed method adopts a step-by-step estimation scheme: palm pose estimation, finger yaw motion estimation, and finger pitch motion estimation, performed in consecutive order. Assuming the hand to be a plane, the proposed method uses a planar hand model, which facilitates hand model regeneration. The hand model regeneration step modifies the hand model to fit the current user's hand and improves the robustness and accuracy of the tracking results. The proposed method runs in real time and does not require GPU-based processing, so it can be applied to various platforms, including mobile devices such as Google Glass. The effectiveness and performance of the proposed method are verified through various experiments.
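
The 21 degrees of freedom are consistent with one common breakdown, 6 for the global palm pose plus one yaw and two pitch angles per finger (6 + 5 × 3 = 21); the sketch below only illustrates that assumed parameterization and the consecutive estimation order, not the paper's actual model:

```python
from dataclasses import dataclass, field
import numpy as np

@dataclass
class HandPose21DoF:
    """Assumed 21-DoF hand parameterization: 6 DoF for the global palm pose,
    then 1 yaw (abduction) and 2 pitch (flexion) angles per finger,
    6 + 5 * (1 + 2) = 21. In the step-by-step scheme the palm pose is
    estimated first, then finger yaw, then finger pitch."""
    palm_rotation: np.ndarray = field(default_factory=lambda: np.zeros(3))      # step 1: palm pose
    palm_translation: np.ndarray = field(default_factory=lambda: np.zeros(3))   # step 1: palm pose
    finger_yaw: np.ndarray = field(default_factory=lambda: np.zeros(5))         # step 2: yaw per finger
    finger_pitch: np.ndarray = field(default_factory=lambda: np.zeros((5, 2)))  # step 3: two pitch angles per finger

    def dof(self) -> int:
        return int(self.palm_rotation.size + self.palm_translation.size
                   + self.finger_yaw.size + self.finger_pitch.size)

assert HandPose21DoF().dof() == 21
```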

A Study on Hand Region Detection for Kinect-Based Hand Shape Recognition (Kinect 기반 손 모양 인식을 위한 손 영역 검출에 관한 연구)

  • Park, Hanhoon;Choi, Junyeong;Park, Jong-Il;Moon, Kwang-Seok
    • Journal of Broadcast Engineering / v.18 no.3 / pp.393-400 / 2013
  • Hand shape recognition is a fundamental technique for implementing natural human-computer interaction. In this paper, we discuss a method for effectively detecting the hand region in Kinect-based hand shape recognition. Since the Kinect is a camera that captures color images and infrared (depth) images together, both can be exploited for detecting the hand region: a hand region can be found from pixels with skin color or from pixels at a specific depth. Therefore, after analyzing the performance of each cue, we need a method that properly combines both to extract a clean silhouette of the hand region, because the hand shape recognition rate depends on the quality of the detected silhouette. Finally, by comparing the hand shape recognition rates obtained with different hand region detection methods in general environments, we propose a high-performance hand region detection method.
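
A minimal sketch of combining the two cues discussed above into one hand mask; the YCrCb skin thresholds and the 400-800 mm working range are assumed values, and the Kinect-specific capture code is omitted (OpenCV/NumPy):

```python
import cv2
import numpy as np

def detect_hand_mask(bgr: np.ndarray, depth_mm: np.ndarray,
                     near_mm: int = 400, far_mm: int = 800) -> np.ndarray:
    """Intersect a skin-color mask and a depth-range mask to isolate the hand
    silhouette. `depth_mm` is assumed to be a depth map (in millimeters)
    registered to the color image."""
    # Color cue: skin-colored pixels in YCrCb space.
    ycrcb = cv2.cvtColor(bgr, cv2.COLOR_BGR2YCrCb)
    skin = cv2.inRange(ycrcb, (0, 133, 77), (255, 173, 127))

    # Depth cue: pixels inside the expected hand distance range.
    near = ((depth_mm >= near_mm) & (depth_mm <= far_mm)).astype(np.uint8) * 255

    # Combine the cues and clean the silhouette with a morphological close.
    mask = cv2.bitwise_and(skin, near)
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    return cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)
```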

Histogram Based Hand Recognition System for Augmented Reality (증강현실을 위한 히스토그램 기반의 손 인식 시스템)

  • Ko, Min-Su;Yoo, Ji-Sang
    • Journal of the Korea Institute of Information and Communication Engineering / v.15 no.7 / pp.1564-1572 / 2011
  • In this paper, we propose a new histogram-based hand recognition algorithm for augmented reality. A hand recognition system enables useful interaction between a user and a computer. However, vision-based hand gesture recognition is difficult because of viewing-angle dependency and the complexity of the human hand shape. The hand recognition system proposed in this paper is based on features derived from hand geometry and consists of two steps: the hand region is first extracted from the image captured by a camera, and hand gestures are then recognized. First, we extract the hand region by removing the background and using skin color information. We then recognize the hand shape by determining hand feature points from a histogram of the extracted hand region. Finally, we build an augmented reality system that controls a 3D object with the recognized hand gesture. Experimental results show that the proposed algorithm achieves more than 91% hand recognition accuracy with low computational cost.
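
One simple way to derive feature points from a histogram of the hand region, as described above, is to project the binary hand mask onto its columns and pick peaks; this is a sketch under assumed details, since the abstract does not specify the exact histogram or feature definition (NumPy only):

```python
import numpy as np

def hand_histogram_features(mask: np.ndarray, min_height: int = 10):
    """Project a binary hand mask onto its columns and pick local peaks of the
    resulting histogram as candidate feature points (roughly, extended fingers)."""
    column_hist = (mask > 0).sum(axis=0)  # number of hand pixels per column

    peaks = []
    for x in range(1, len(column_hist) - 1):
        # A peak column is taller than its neighbours and above the minimum height.
        if (column_hist[x] >= min_height
                and column_hist[x] > column_hist[x - 1]
                and column_hist[x] >= column_hist[x + 1]):
            top_y = int(np.argmax(mask[:, x] > 0))  # topmost hand pixel in the column
            peaks.append((int(x), top_y))
    return column_hist, peaks
```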