• Title/Abstract/Keywords: Gesture segmentation


퀴즈게임의 체감형 제스처 인터페이스 프로토타입 개발 (A Study on Tangible Gesture Interface Prototype Development of the Quiz Game)

  • 안정호;고재필
    • 디지털콘텐츠학회 논문지 / Vol.13 No.2 / pp.235-245 / 2012
  • In this paper, we propose quiz-game content based on a user gesture interface. By identifying the elements of a quiz game that had traditionally been run in an analog fashion and digitizing them, we enabled the content program to take over the role of the quiz host. Using a Kinect camera, we acquire depth images and developed preprocessing steps on the depth image, such as user segmentation, head detection and tracking, and hand detection, together with recognition of command-style hand gestures such as raising a hand, moving a hand up and down, making a fist, passing, and pulling with a clenched fist. In particular, by defining the interface gestures as the motions people use to manipulate physical objects in everyday life, we designed the interface so that users can physically experience abstract concepts such as moving, selecting, and confirming during interaction. Compared with previously published work, we improved the completeness of the content by adding a card-reward procedure for the winning team, substantially improved answer-choice selection by refining the hand up/down movement recognition and fist-shape recognition algorithms, and demonstrated satisfactory recognition performance through systematic experiments. The implemented content showed satisfactory gesture recognition results in real-time tests and allowed the quiz game to proceed smoothly.
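
The abstract above names the preprocessing steps (user segmentation from the depth image, head detection and tracking, hand detection) without implementation detail. As a rough illustration of the first two steps only, the following sketch segments the nearest user from a depth frame by depth thresholding and takes the top of the silhouette as the head; the depth band, morphology kernel, and 20-row head band are assumed values, not the paper's.

```python
import cv2
import numpy as np

def segment_user_and_head(depth_mm, min_depth=500, max_depth=3000):
    """Segment the nearest person from a depth frame and locate the head.

    depth_mm: 2D uint16 array of depth values in millimeters from a
    Kinect-style sensor. The depth band and the 20-row head band are
    illustrative assumptions, not values from the paper.
    """
    # Keep only pixels inside a plausible interaction range.
    mask = ((depth_mm > min_depth) & (depth_mm < max_depth)).astype(np.uint8) * 255
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

    # Take the largest connected component as the user silhouette.
    num, labels, stats, _ = cv2.connectedComponentsWithStats(mask)
    if num < 2:
        return None, None
    user_label = 1 + int(np.argmax(stats[1:, cv2.CC_STAT_AREA]))
    user_mask = (labels == user_label).astype(np.uint8)

    # Approximate the head as the centroid of the topmost rows of the blob.
    ys, xs = np.nonzero(user_mask)
    head_rows = ys <= ys.min() + 20
    head = (int(xs[head_rows].mean()), int(ys[head_rows].mean()))
    return user_mask, head
```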

인간의 행동 인식을 위한 얼굴 방향과 손 동작 해석 (Analysis of Face Direction and Hand Gestures for Recognition of Human Motion)

  • 김성은;조강현;전희성;최원호;박경섭
    • 제어로봇시스템학회논문지 / Vol.7 No.4 / pp.309-318 / 2001
  • In this paper, we describe methods for analyzing human gestures. A human interface (HI) system for gesture analysis extracts the head and hand regions after capturing image sequences of an operator's continuous behavior with CCD cameras. Since gestures are performed with the operator's head and hand motions, we extract the head and hand regions to analyze gestures and compute geometrical information about the extracted skin regions. Head motion is analyzed by obtaining the face direction. We assume the head is an ellipsoid in 3D coordinates and locate facial features such as the eyes, nose, and mouth on its surface. Once the center of the feature points is known, the angle of that center within the ellipsoid gives the direction of the face. The hand region obtained from preprocessing may include the arm as well as the hand. To extract only the hand region, we find the wrist line that divides the hand and arm regions. After separating the hand region at the wrist line, we model it as an ellipse for the analysis of hand data, and the finger part is represented as a long, narrow shape. We extract hand information such as size, position, and shape.
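
As a small illustration of the hand-analysis step described above, the sketch below fits an ellipse to a segmented hand region (arm already removed at the wrist line) and derives position, size, and orientation features. The specific feature set and the elongation cue are assumptions for illustration, not the authors' exact parameters.

```python
import cv2
import numpy as np

def hand_ellipse_features(hand_mask):
    """Model a segmented hand region (arm removed at the wrist line) as an
    ellipse and return simple hand data; the feature names here are
    illustrative, not the paper's exact parameter set."""
    contours, _ = cv2.findContours(hand_mask.astype(np.uint8),
                                   cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    hand = max(contours, key=cv2.contourArea)
    if len(hand) < 5:                      # cv2.fitEllipse needs >= 5 points
        return None
    (cx, cy), (ax1, ax2), angle = cv2.fitEllipse(hand)
    major, minor = max(ax1, ax2), min(ax1, ax2)
    return {
        "position": (cx, cy),              # ellipse center
        "size": cv2.contourArea(hand),     # hand region area in pixels
        "orientation": angle,              # ellipse rotation in degrees
        "elongation": major / max(minor, 1e-6),  # long/narrow shape cue
    }
```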


A Vision-Based Method to Find Fingertips in a Closed Hand

  • Chaudhary, Ankit;Vatwani, Kapil;Agrawal, Tushar;Raheja, J.L.
    • Journal of Information Processing Systems / Vol.8 No.3 / pp.399-408 / 2012
  • Hand gesture recognition is an important research area in the field of Human Computer Interaction (HCI). The geometric attributes of the hand play an important role in hand shape reconstruction and gesture recognition. In particular, fingertips are among the most important attributes for detecting hand gestures and can provide valuable information from hand images. Many methods for fingertip detection with an open hand are available in the scientific literature, but results are very poor for fingertip detection when the hand is closed. This paper presents a new method for detecting fingertips in a closed hand using a corner detection method and an advanced edge detection algorithm. It is important to note that skin color segmentation did not work for fingertip detection in a closed hand. The proposed method therefore applies Gabor filter techniques to detect edges and then applies a corner detection algorithm to locate fingertips along those edges. To check its accuracy, the method was tested on a large number of images taken with a webcam and achieved a high detection accuracy. It was further implemented on video to test its validity for real-time image capture. This closed-hand fingertip detection would help in controlling an electro-mechanical robotic hand via hand gestures in a natural way.
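
The abstract describes edge detection with Gabor filters followed by corner detection on the edges. The following sketch approximates that pipeline with a small bank of Gabor kernels and Shi-Tomasi corners (cv2.goodFeaturesToTrack); the filter parameters and the choice of corner detector are assumptions, since the abstract does not specify them.

```python
import cv2
import numpy as np

def fingertip_candidates(gray_hand, max_corners=15):
    """Sketch of the Gabor-edge + corner-detection idea described above.

    gray_hand: grayscale image cropped to the closed hand. The Gabor
    parameters and the use of Shi-Tomasi corners are illustrative
    assumptions; the paper's exact filters and detector may differ.
    """
    # Accumulate Gabor responses over several orientations to emphasize
    # the short edge fragments around folded fingertips.
    edges = np.zeros_like(gray_hand, dtype=np.float32)
    for theta in np.arange(0, np.pi, np.pi / 8):
        kernel = cv2.getGaborKernel((21, 21), sigma=4.0, theta=theta,
                                    lambd=10.0, gamma=0.5, psi=0)
        edges = np.maximum(edges, cv2.filter2D(gray_hand.astype(np.float32),
                                               cv2.CV_32F, kernel))
    edges = cv2.normalize(edges, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)

    # Corners on the edge map are taken as fingertip candidates.
    corners = cv2.goodFeaturesToTrack(edges, maxCorners=max_corners,
                                      qualityLevel=0.05, minDistance=10)
    return [] if corners is None else [tuple(pt) for pt in corners.reshape(-1, 2)]
```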

컴퓨터 인터페이스를 위한 Hand Gesture 인식에 관한 연구 (A Study of Hand Gesture Recognition for Human Computer Interface)

  • 장호정;백한욱;정진현
    • 대한전기학회:학술대회논문집 / 대한전기학회 2000년도 하계학술대회 논문집 D / pp.3041-3043 / 2000
  • The GUI (graphical user interface) has been the dominant platform for HCI (human computer interaction). The GUI-based style of interaction has made computers simpler and easier to use. However, GUIs will not easily support the range of interaction necessary to meet users' needs for interfaces that are natural, intuitive, and adaptive. In this paper, we study an approach that tracks a hand in an image sequence and recognizes it in each video frame, replacing the mouse as a pointing device for virtual reality. An algorithm for real-time processing is proposed that estimates the position of the hand and segments it, considering the orientation of motion and the color distribution of the hand region.
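
As a minimal sketch of combining the two cues mentioned above (motion and the color distribution of the hand region), the code below intersects a generic skin-color mask with a frame-differencing motion mask. The YCrCb skin range and the differencing threshold are textbook values, not the authors'.

```python
import cv2
import numpy as np

def hand_mask_from_color_and_motion(prev_bgr, curr_bgr):
    """Rough sketch of fusing a skin-color cue with a motion cue to
    localize the hand; the ranges and thresholds are generic values,
    not the paper's."""
    # Skin-color likelihood in YCrCb space.
    ycrcb = cv2.cvtColor(curr_bgr, cv2.COLOR_BGR2YCrCb)
    skin = cv2.inRange(ycrcb, (0, 133, 77), (255, 173, 127))

    # Motion cue from simple frame differencing.
    diff = cv2.absdiff(cv2.cvtColor(prev_bgr, cv2.COLOR_BGR2GRAY),
                       cv2.cvtColor(curr_bgr, cv2.COLOR_BGR2GRAY))
    _, moving = cv2.threshold(diff, 20, 255, cv2.THRESH_BINARY)

    # The hand is assumed to be a skin-colored region that is also moving.
    mask = cv2.bitwise_and(skin, moving)
    return cv2.morphologyEx(mask, cv2.MORPH_CLOSE, np.ones((7, 7), np.uint8))
```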


손 동작을 통한 인간과 컴퓨터간의 상호 작용 (Recognition of Hand gesture to Human-Computer Interaction)

  • 이래경;김성신
    • 대한전기학회:학술대회논문집 / 대한전기학회 2000년도 하계학술대회 논문집 D / pp.2930-2932 / 2000
  • In this paper, a robust gesture recognition system is designed and implemented to explore communication methods between human and computer. Hand gestures in the proposed approach are used to communicate with a computer for actions with a high degree of freedom. The user does not need to wear any cumbersome devices such as cyber-gloves. No assumption is made about whether the user is wearing any ornaments or whether the user is gesturing with the left or right hand. Image segmentation based on skin color and shape analysis based on invariant moments are combined. The extracted features are used as input vectors to a radial basis function network (RBFN). Our "Puppy" robot is employed as a testbed. Preliminary results on a set of gestures show recognition rates of about 87% in a real-time implementation.
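
The abstract combines skin-color segmentation, invariant moments, and a radial basis function network. Below is a minimal sketch of the last two pieces: Hu's invariant moments as shape features and a generic RBFN with fixed centers and a least-squares readout. It is a stand-in for the general technique, not the authors' network or training procedure.

```python
import cv2
import numpy as np

def invariant_moment_features(binary_hand):
    """Hu's seven invariant moments of a segmented hand silhouette,
    log-scaled as is common practice (the exact feature scaling is not
    specified in the abstract)."""
    hu = cv2.HuMoments(cv2.moments(binary_hand.astype(np.uint8))).flatten()
    return -np.sign(hu) * np.log10(np.abs(hu) + 1e-30)

class RBFN:
    """Minimal radial basis function network: Gaussian units on fixed
    prototype centers plus a least-squares linear readout."""
    def __init__(self, centers, sigma=1.0):
        self.centers, self.sigma = np.asarray(centers), sigma

    def _phi(self, X):
        d2 = ((X[:, None, :] - self.centers[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2 * self.sigma ** 2))

    def fit(self, X, Y):                    # Y: one-hot gesture labels
        self.W, *_ = np.linalg.lstsq(self._phi(X), Y, rcond=None)
        return self

    def predict(self, X):
        return np.argmax(self._phi(X) @ self.W, axis=1)
```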


Emergency Signal Detection based on Arm Gesture by Motion Vector Tracking in Face Area

  • Fayyaz, Rabia;Park, Dae Jun;Rhee, Eun Joo
    • 한국정보전자통신기술학회논문지 / Vol.12 No.1 / pp.22-28 / 2019
  • This paper presents a method for detecting an emergency signal expressed by arm gestures, based on motion segmentation and face area detection in a surveillance system. Arm gestures and voice can be important indicators of an emergency. We define the emergency signal as 'Help Me' arm gestures within a rectangle around the face. The 'Help Me' arm gestures are detected by tracking changes in the direction of the horizontal motion vectors of the left and right arms. The experimental results show that the proposed method successfully detects the 'Help Me' emergency signal for a single person and distinguishes it from other similar arm gestures, such as waving 'Bye' and stretching. The proposed method can be used effectively in situations where people cannot speak or have a language or voice disability.
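
As a rough stand-in for the motion-vector tracking described above, the sketch below measures the mean horizontal optical flow in bands to the left and right of a detected face; alternating signs of these means over successive frames would suggest the back-and-forth 'Help Me' arm motion. The band geometry, the use of Farneback flow, and the parameters are assumptions, not the paper's scheme.

```python
import cv2
import numpy as np

def arm_flow_directions(prev_gray, curr_gray, face_rect):
    """Mean horizontal optical-flow direction in regions to the left and
    right of the face. face_rect: (x, y, w, h) from any face detector."""
    x, y, w, h = face_rect
    flow = cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    h_img, w_img = curr_gray.shape
    band_h = slice(max(y - h, 0), min(y + 2 * h, h_img))    # rows around the face
    left = flow[band_h, max(x - 2 * w, 0):x, 0]              # horizontal component
    right = flow[band_h, x + w:min(x + 3 * w, w_img), 0]
    # Sign of the mean x-flow in each band; alternating signs over time
    # indicate back-and-forth arm motion.
    return (np.sign(left.mean() if left.size else 0.0),
            np.sign(right.mean() if right.size else 0.0))
```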

Interactive drawing with user's intentions using image segmentation

  • Lim, Sooyeon
    • International Journal of Internet, Broadcasting and Communication / Vol.10 No.3 / pp.73-80 / 2018
  • This study introduces an interactive drawing system, a tool that allows users to sketch and draw according to their own intentions. The proposed system enables users to express themselves more creatively through a tool that lets them reproduce their original idea as a drawing and transform it with their body. Users can actively participate in the production of the artwork by studying the unique formative language of the spectator. In addition, users are given an opportunity to experience a creative process by transforming an arbitrary drawing into various shapes according to their gestures. The interactive drawing system uses segmentation of the drawing image as a way to extend the user's initial drawing idea, transforming a two-dimensional drawing into a volume-like form, such as a three-dimensional drawing. In this process, a psychological space is created that can stimulate the user's imagination and onto which the object of desire can be projected. This personification of the drawing gives the user a sense of familiarity with the artwork and an indirect way of expressing his or her emotions to others. This means that interactive drawing, which has moved beyond the concept of information transfer to an emotional concept of interaction, can create a cooperative, sensory image between the user's time and space and occupy an important position in a multimedia society.

저해상도 손 제스처 영상 인식에 대한 연구 (A Study on Hand Gesture Recognition with Low-Resolution Hand Images)

  • 안정호
    • 한국위성정보통신학회논문지 / Vol.9 No.1 / pp.57-64 / 2014
  • Recently, there has been much research on human-friendly human-machine interfaces that allow people to interact with systems without the help of physical devices. A representative example is vision-based gesture recognition, the subject of this paper. In this paper, we define hand gestures for interaction with objects in a virtual world and propose an effective methodology for recognizing them. First, we detect and track the user's two hands in low-resolution images captured by a webcam, and segment the hand regions to extract hand silhouettes. For hand detection, we model skin color in RGB space with two elliptical models according to brightness, and we perform hand tracking using a blob matching method. We recognize three hand shapes, Thumb-Up, Palm, and Cross, by detecting and analyzing the row/column modes of hand silhouettes obtained with a flood-fill algorithm. By analyzing the recognized hand shapes together with the context of the hand movements, we could recognize five gestures. The proposed gesture recognition methodology assumes that a single main user appears in front of the camera for accurate hand detection, and its efficiency and accuracy have been demonstrated through many real-time demonstrations.
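
As an illustration of the row/column mode analysis mentioned above, the sketch below counts peaks in the row and column projections of a binary hand silhouette and maps the counts to the three hand shapes. The peak-counting rule and the mapping to Thumb-Up / Palm / Cross are illustrative guesses, not the paper's actual decision rules.

```python
import numpy as np

def silhouette_row_col_shape(silhouette):
    """Row/column projection analysis of a binary hand silhouette; the
    thresholds and shape mapping below are hypothetical."""
    rows = silhouette.sum(axis=1).astype(float)   # pixels per row
    cols = silhouette.sum(axis=0).astype(float)   # pixels per column

    def count_modes(profile, min_height_ratio=0.3):
        # A "mode" here is a local maximum above a fraction of the peak.
        thr = profile.max() * min_height_ratio
        peaks = 0
        for i in range(1, len(profile) - 1):
            if profile[i] >= thr and profile[i] > profile[i - 1] and profile[i] >= profile[i + 1]:
                peaks += 1
        return peaks

    r_modes, c_modes = count_modes(rows), count_modes(cols)
    if r_modes >= 2 and c_modes >= 2:
        return "Cross"
    if c_modes >= 2:
        return "Palm"        # spread fingers tend to give multiple column peaks
    return "Thumb-Up"
```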

서비스 로봇을 위한 지시 물체 분할 방법 (Segmentation of Pointed Objects for Service Robots)

  • 김형오;김수환;김동환;박성기
    • 로봇학회논문지 / Vol.4 No.2 / pp.139-146 / 2009
  • This paper describes how an unknown object indicated by a person's pointing gesture is extracted while the person interacts with a robot. Using a stereo vision sensor, our proposed method consists of three stages: detection of the operator's face, estimation of the pointing direction, and extraction of the pointed object. The operator's face is recognized using Haar-like features. We then estimate the 3D pointing direction from the shoulder-to-hand line. Finally, we segment the unknown object from the 3D point cloud in the estimated region of interest. On the basis of this proposed method, we implemented an object registration system on our mobile robot and obtained reliable experimental results.
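
As a simplified sketch of the final stage above (segmenting the pointed object from the point cloud in the estimated region of interest), the code below keeps the 3D points that lie inside a narrow cone around the shoulder-to-hand ray. The cone angle and minimum offset are assumed values; the paper's actual ROI estimation and segmentation may differ.

```python
import numpy as np

def points_along_pointing_ray(cloud, shoulder, hand,
                              max_angle_deg=10.0, min_dist=0.2):
    """Select 3D points near the shoulder-to-hand pointing ray.

    cloud: (N, 3) array of 3D points; shoulder, hand: 3D points in the
    same frame. The 10-degree cone and 0.2 m offset are assumptions."""
    shoulder, hand = np.asarray(shoulder, float), np.asarray(hand, float)
    direction = hand - shoulder
    direction /= np.linalg.norm(direction)

    rel = cloud - hand                       # measure from the hand outward
    dist_along = rel @ direction             # projection onto the ray
    radial = np.linalg.norm(rel - np.outer(dist_along, direction), axis=1)

    # Keep points beyond the hand whose angular offset from the ray is small.
    angle = np.degrees(np.arctan2(radial, dist_along))
    keep = (dist_along > min_dist) & (angle < max_angle_deg)
    return cloud[keep]
```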


Hue 영상을 기반한 손 영역 검출 및 추적 (Hand Region Segmentation and Tracking Based on Hue Image)

  • 권화중;이준호
    • 대한전자공학회:학술대회논문집 / 대한전자공학회 1999년도 추계종합학술대회 논문집 / pp.1003-1006 / 1999
  • Hand segmentation and tracking is essential to the development of a hand gesture recognition system. This research features segmentation and tracking of hand regions based on the hue component of color. We propose a method that employs the HSI color model and segments and tracks hand regions using the hue component alone. In order to track the segmented hand regions, we apply a Kalman filter only to a region of interest represented by a rectangular region. Initial experimental results show that the system accurately segments and tracks hand regions even though it uses only the hue component of color. The system yields a near real-time throughput of 8 frames per second on a Pentium II 233 MHz PC.
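
As a minimal sketch of the two ingredients named above, the code below builds a skin mask from the hue channel alone and sets up a constant-velocity Kalman filter for the center of the tracked hand ROI. OpenCV exposes HSV rather than the paper's HSI model, and the hue band and noise covariances are generic assumptions.

```python
import cv2
import numpy as np

def hue_hand_mask(bgr, hue_lo=0, hue_hi=20):
    """Segment skin-like regions using only the hue channel (HSV here,
    as an approximation of the paper's HSI hue; the band is generic)."""
    hue = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)[:, :, 0]
    return cv2.inRange(hue, hue_lo, hue_hi)

def make_centroid_kalman():
    """Constant-velocity Kalman filter for the center of the tracked
    hand ROI (state: x, y, vx, vy; measurement: x, y)."""
    kf = cv2.KalmanFilter(4, 2)
    kf.transitionMatrix = np.array([[1, 0, 1, 0],
                                    [0, 1, 0, 1],
                                    [0, 0, 1, 0],
                                    [0, 0, 0, 1]], np.float32)
    kf.measurementMatrix = np.array([[1, 0, 0, 0],
                                     [0, 1, 0, 0]], np.float32)
    kf.processNoiseCov = np.eye(4, dtype=np.float32) * 1e-2
    kf.measurementNoiseCov = np.eye(2, dtype=np.float32) * 1e-1
    return kf

# Per frame: call kf.predict(), then kf.correct() with the measured
# centroid of the hue mask inside the current region of interest.
```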
