• Title/Summary/Keyword: Hand Gesture

Skin Color Based Hand and Finger Detection for Gesture Recognition in CCTV Surveillance (CCTV 관제에서 동작 인식을 위한 색상 기반 손과 손가락 탐지)

  • Kang, Sung-Kwan; Chung, Kyung-Yong; Rim, Kee-Wook; Lee, Jung-Hyun
    • The Journal of the Korea Contents Association, v.11 no.10, pp.1-10, 2011
  • In this paper, we propose a skin-color-based hand and finger detection technique for gesture recognition in CCTV surveillance. The aim of this paper is to present a methodology for hand detection and to propose a finger detection method. The detected hand and fingers can be used to implement a non-contact mouse, and the technology can also control home devices such as home theaters and televisions. Skin color is used to segment the hand region from the background, and a contour is extracted from the segmented hand; analysis of the contour yields the location of the fingertip. After detecting the fingertip, the system tracks it using the R channel alone and applies differential images to hand-motion recognition, which makes the method robust by removing irrelevant image content. We describe experiments on fingertip tracking and finger gesture recognition; the results show an accuracy above 96%.
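
As a rough illustration of the pipeline this abstract describes (skin-color segmentation, contour extraction, fingertip location), here is a minimal OpenCV sketch. The YCrCb thresholds and the farthest-point-from-centroid fingertip heuristic are illustrative assumptions, not the authors' parameters.

```python
import cv2
import numpy as np

def find_fingertip(bgr_frame):
    # Segment skin-colored pixels in YCrCb space (a common heuristic range).
    ycrcb = cv2.cvtColor(bgr_frame, cv2.COLOR_BGR2YCrCb)
    mask = cv2.inRange(ycrcb, (0, 133, 77), (255, 173, 127))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

    # Take the largest contour as the hand region.
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    hand = max(contours, key=cv2.contourArea)

    # Crude fingertip estimate: the contour point farthest from the centroid.
    m = cv2.moments(hand)
    if m["m00"] == 0:
        return None
    cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]
    pts = hand.reshape(-1, 2).astype(float)
    tip = pts[np.argmax(np.hypot(pts[:, 0] - cx, pts[:, 1] - cy))]
    return int(tip[0]), int(tip[1])
```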

A Study on User Interface for Quiz Game Contents using Gesture Recognition (제스처인식을 이용한 퀴즈게임 콘텐츠의 사용자 인터페이스에 대한 연구)

  • Ahn, Jung-Ho
    • Journal of Digital Contents Society, v.13 no.1, pp.91-99, 2012
  • In this paper, we introduce a quiz application that digitizes the analogue quiz game. We automate the quiz components that are performed manually in a conventional quiz game, such as quiz proceedings, participant recognition, problem presentation, recognition of the first volunteer to raise a hand, answer judgement, score addition, and winner decision. For automation, we obtained depth images from the Kinect camera, which has recently drawn attention, located the quiz participants, and recognized a set of user-friendly gestures. By analyzing the depth distribution, we detected and segmented the upper body parts and located the hand areas. We also extracted hand features and designed a decision function that classifies the hand pose as palm, fist, or other, so that a participant can select the answer choice he or she wants among those presented. The implemented quiz application was tested in real time and showed very satisfactory gesture recognition results.
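
A hedged sketch of the kind of decision function the abstract mentions for classifying a segmented hand as palm, fist, or other. The solidity measure and its thresholds are assumptions standing in for the paper's actual hand features.

```python
import cv2

def classify_hand(hand_mask):
    """hand_mask: 8-bit binary image containing a single hand silhouette."""
    contours, _ = cv2.findContours(hand_mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return "else"
    hand = max(contours, key=cv2.contourArea)
    hull_area = cv2.contourArea(cv2.convexHull(hand))
    if hull_area == 0:
        return "else"
    # Solidity: an open palm leaves gaps between the fingers (low value),
    # while a fist is compact (value near 1).
    solidity = cv2.contourArea(hand) / hull_area
    if solidity > 0.85:
        return "fist"
    if solidity < 0.70:
        return "palm"
    return "else"
```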

Investigating Key User Experience Factors for Virtual Reality Interactions

  • Ahn, Junyoung; Choi, Seungho; Lee, Minjae; Kim, Kyungdoh
    • Journal of the Ergonomics Society of Korea, v.36 no.4, pp.267-280, 2017
  • Objective: The aim of this study is to investigate key user experience factors of interactions for Head Mounted Display (HMD) devices in the Virtual Reality Environment (VRE). Background: Virtual reality interaction research has been conducted steadily as interaction methods and virtual reality devices have improved. Recently, virtual reality devices have all become HMD-based, and HMD-based interaction types include Remote Controller, Head Tracking, and Hand Gesture. However, there are few studies on the usability of virtual reality, and the usability of HMD-based virtual reality in particular has not been investigated; a study of it is therefore needed. Method: HMD-based VR devices released recently support only three interaction types: 'Remote Controller', 'Head Tracking', and 'Hand Gesture'. We surveyed 113 studies to identify the user experience factors and evaluation scales used for each interaction type, then summarized the key factors and relevant scales according to how frequently they appeared. Results: Each interaction type has its own key user experience factors. The Remote Controller's key factors are 'Ease of learning', 'Ease of use', 'Satisfaction', 'Effectiveness', and 'Efficiency'. Head Tracking's key factors are 'Sickness', 'Immersion', 'Intuitiveness', 'Stress', 'Fatigue', and 'Ease of learning'. Finally, Hand Gesture's key factors are 'Ease of learning', 'Ease of use', 'Feedback', 'Consistency', 'Simplicity', 'Naturalness', 'Efficiency', 'Responsiveness', 'Usefulness', 'Intuitiveness', and 'Adaptability'. Conclusion: We identified key user experience factors for each interaction type through a literature review; however, we did not consider objective measures because each study adopted different performance factors. Application: The results of this study can be used when evaluating the usability of HMD-based interactions in virtual reality.

Study on the Hand Gesture Recognition System and Algorithm based on Millimeter Wave Radar (밀리미터파 레이더 기반 손동작 인식 시스템 및 알고리즘에 관한 연구)

  • Lee, Youngseok
    • The Journal of Korea Institute of Information, Electronics, and Communication Technology, v.12 no.3, pp.251-256, 2019
  • In this paper, we propose a system and algorithm to recognize hand gestures based on millimeter wave radar in the 65 GHz band. The proposed system consists of a millimeter wave radar board, an analog-to-digital conversion and data capture board, and a notebook computer that runs the gesture recognition algorithms. As feature vectors, the proposed algorithm uses global and local Zernike moment descriptors, which are robust to distortion from rotation or scaling of 2D data. In the experiments, the performance of the proposed algorithm is evaluated and compared with algorithms that use only a global or only a local Zernike descriptor as the feature vector. Analysis of the confusion matrices shows that the proposed algorithm performs better in precision, accuracy, and sensitivity, with a total performance index of 95.6% compared with 88.4% and 84% for the other two methods.
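
The combined global-plus-local Zernike descriptor can be sketched as below, using the mahotas library's Zernike implementation as a stand-in. The 2x2 local grid, radii, and moment degree are illustrative assumptions rather than the paper's settings.

```python
import numpy as np
import mahotas

def zernike_features(img, degree=8):
    """img: 2D array, e.g. a pre-processed radar image of the hand."""
    h, w = img.shape
    # Global descriptor over the whole image.
    global_desc = mahotas.features.zernike_moments(img, min(h, w) // 2,
                                                   degree=degree)
    # Local descriptors over a 2x2 grid of sub-blocks.
    local_desc = []
    for i in (0, 1):
        for j in (0, 1):
            block = img[i * h // 2:(i + 1) * h // 2,
                        j * w // 2:(j + 1) * w // 2]
            local_desc.append(mahotas.features.zernike_moments(
                block, min(block.shape) // 2, degree=degree))
    # Concatenate into one feature vector for the classifier.
    return np.concatenate([global_desc] + local_desc)
```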

Gadget Arms: Interactive Data Visualization using Hand Gesture in Extended Reality (가젯암: 확장현실을 위한 손 제스처 기반 대화형 데이터 시각화 시스템)

  • Choi, JunYoung; Jeong, HaeJin; Jeong, Won-Ki
    • Journal of the Korea Computer Graphics Society, v.25 no.2, pp.31-41, 2019
  • Extended Reality (XR), such as virtual and augmented reality, has huge potential for immersive data visualization and analysis. In XR, users can interact with data and with other users realistically by navigating a shared virtual space, allowing for more intuitive data analysis. However, creating a visualization in XR also poses a challenge because complicated, low-level programming is required, which hinders broad adoption in visual analytics. This paper proposes Gadget Arms, an interactive visualization authoring tool based on hand gestures for immersive data visualization. The proposed system provides a novel user interaction to create and place visualizations in the 3D virtual world. This simple but intuitive interaction enables the user to design the entire visualization space in XR without using a host computer or low-level programming. Our user study also confirmed that the proposed interaction significantly improves the usability of the visualization authoring tool.

A Comparison of Gesture Recognition Performance Based on Feature Spaces of Angle, Velocity and Location in HMM Model (HMM인식기 상에서 방향, 속도 및 공간 특징량에 따른 제스처 인식 성능 비교)

  • 윤호섭; 양현승
    • Journal of KIISE: Software and Applications, v.30 no.5_6, pp.430-443, 2003
  • The objective of this paper is to evaluate the most useful feature vector space among the angle, velocity, and location features computed from gesture trajectories, which are obtained by extracting hand regions from consecutive input images and tracking them by connecting their positions. For this purpose, a gesture tracking algorithm using color and motion information was developed. The recognition module is an HMM, which is well suited to time-varying data. The proposed algorithm was applied to a database containing 4,800 alphabetical handwriting gestures from 20 people, each of whom was asked to draw his or her handwriting gestures five times for each of the 48 characters.
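
A minimal sketch of the angle-feature branch evaluated here: quantize the trajectory's movement direction into discrete codes and train one discrete HMM per gesture class. hmmlearn's CategoricalHMM is an assumed stand-in for the paper's recognizer, and the bin and state counts are illustrative.

```python
import numpy as np
from hmmlearn import hmm

def direction_codes(trajectory, n_bins=16):
    """trajectory: (T, 2) array of tracked hand positions."""
    d = np.diff(np.asarray(trajectory, dtype=float), axis=0)
    angles = np.arctan2(d[:, 1], d[:, 0])               # range (-pi, pi]
    return ((angles + np.pi) / (2 * np.pi) * n_bins).astype(int) % n_bins

def train_gesture_hmm(sequences, n_states=6):
    """Train one discrete HMM per gesture class on its quantized sequences."""
    X = np.concatenate(sequences).reshape(-1, 1)
    lengths = [len(s) for s in sequences]
    model = hmm.CategoricalHMM(n_components=n_states, n_iter=50)
    model.fit(X, lengths)
    return model  # classify by argmax of model.score(...) across class models
```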

A Development of the Next-generation Interface System Based on the Finger Gesture Recognizing in Use of Image Process Techniques (영상처리를 이용한 지화인식 기반의 차세대 인터페이스 시스템 개발)

  • Kim, Nam-Ho
    • Journal of the Korea Institute of Information and Communication Engineering, v.15 no.4, pp.935-942, 2011
  • This study aims to design and implement a finger gesture recognition system that automatically recognizes finger gestures input through a camera and controls the computer. Common CCD cameras were modified into infrared cameras to acquire the images. The recorded images are pre-processed to find the hand features, the finger gestures are read accordingly, and an event is generated for subsequent mouse control and presentation; in this way, a method of controlling computers is suggested. The finger gesture recognition system presented in this study has been verified as a next-generation interface that can replace the mouse and keyboard.
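
The abstract does not name its event mechanism, so the following is only a hypothetical sketch of the final step: mapping a detected fingertip from camera coordinates to screen coordinates and moving the OS cursor, here via the pyautogui library. The camera resolution is an assumption.

```python
import pyautogui

CAM_W, CAM_H = 640, 480                 # assumed camera resolution
SCREEN_W, SCREEN_H = pyautogui.size()

def fingertip_to_mouse(tip_x, tip_y):
    # Linear map from camera frame to screen, mirrored horizontally so the
    # cursor follows the user's hand naturally.
    sx = (1 - tip_x / CAM_W) * SCREEN_W
    sy = (tip_y / CAM_H) * SCREEN_H
    pyautogui.moveTo(sx, sy)
```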

Android Platform based Gesture Recognition using Smart Phone Sensor Data (안드로이드 플랫폼기반 스마트폰 센서 정보를 활용한 모션 제스처 인식)

  • Lee, Yong Cheol; Lee, Chil Woo
    • Smart Media Journal, v.1 no.4, pp.18-26, 2012
  • The growing number of smartphone applications has reinforced the importance of new user interfaces and raised interest in research on the fusion of multiple sensors. In this paper, we propose a method that fuses the acceleration, magnetic, and gyro sensors to recognize gestures from the motion of a user's smartphone. The proposed method first obtains the 3D orientation of the smartphone and then recognizes hand-motion gestures using an HMM (Hidden Markov Model). The 3D orientation is represented in spherical coordinates and quantized so that the representation is more sensitive to the rotation axis. Experimental results show that the success rate of our method is 93%.
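
The spherical-coordinate quantization step might look like the sketch below: convert the fused 3D orientation vector to (theta, phi) and bin both angles into one discrete observation symbol for the HMM. The bin counts are illustrative assumptions.

```python
import numpy as np

def orientation_symbol(v, n_theta=8, n_phi=16):
    """v: 3D orientation vector from the fused accel/magnetic/gyro data."""
    x, y, z = np.asarray(v, dtype=float) / np.linalg.norm(v)
    theta = np.arccos(np.clip(z, -1.0, 1.0))   # polar angle, [0, pi]
    phi = np.arctan2(y, x) + np.pi             # azimuth, [0, 2*pi)
    ti = min(int(theta / np.pi * n_theta), n_theta - 1)
    pj = min(int(phi / (2 * np.pi) * n_phi), n_phi - 1)
    return ti * n_phi + pj                     # one discrete HMM symbol
```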

Extracting Flick Operator for Predicting Performance by GOMS Model in Small Touch Screen

  • Choi, Mikyung; Lee, Bong Geun; Oh, Hyungseok; Myung, Rohae
    • Journal of the Ergonomics Society of Korea, v.32 no.2, pp.179-187, 2013
  • Objective: The purpose of this study is to extract GOMS manual operators without running an experiment with participants. Background: The GOMS model has the advantage of rapid modeling, which suits a technology development environment of short product life cycles and fast pace. However, GOMS was originally designed for the desktop environment, so it is not adequate for the latest HCI environments such as small touch screen devices. This research therefore proposes a methodology for extracting GOMS manual operators that does not rely on experiments, and the flick gesture was selected to illustrate the application of the proposed methodology. Method: The hand gesture to be extracted as an operator is divided into steps, from start to finish, through gesture task analysis. Original GOMS operators are then applied to the similar steps of the gesture, and the operators for the movement stages are modified based on existing Fitts' law research for touch screen devices. Finally, the new operator is derived from these stages, and a validation experiment is performed to verify the validity of the new operator and methodology by comparison with human performance. Results: The average movement times of the participants and of the operator extracted in the case study did not differ significantly, nor did the average movement times across the view types studied. Conclusion: The result of the proposed operator extraction methodology is similar to the result of the experiment with participants, so a GOMS model that includes operators produced by this methodology can be applied to predict user performance. Application: This methodology can be used to develop new finger gestures on touch screens, and to rapidly evaluate the usability of systems that include new finger gesture performance.
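
The movement stages modified by Fitts' law can be illustrated with the Shannon formulation MT = a + b * log2(D/W + 1); the coefficients a and b below are placeholders, not the paper's fitted touch-screen values.

```python
import math

def movement_time(distance_mm, width_mm, a=0.08, b=0.10):
    """Predicted movement time in seconds (Shannon formulation)."""
    return a + b * math.log2(distance_mm / width_mm + 1)

# e.g. a 60 mm flick toward a 10 mm target: ID = log2(7) ~= 2.81 bits
print(f"{movement_time(60, 10):.3f} s")   # -> 0.361 s
```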

Hand posture recognition robust to rotation using temporal correlation between adjacent frames (인접 프레임의 시간적 상관 관계를 이용한 회전에 강인한 손 모양 인식)

  • Lee, Seong-Il; Min, Hyun-Seok; Shin, Ho-Chul; Lim, Eul-Gyoon; Hwang, Dae-Hwan; Ro, Yong-Man
    • Journal of Korea Multimedia Society, v.13 no.11, pp.1630-1642, 2010
  • Recently, there is an increasing need to develop Hand Gesture Recognition (HGR) techniques for vision-based interfaces. Since a hand gesture is defined as a consecutive change of hand postures, a Hand Posture Recognition (HPR) algorithm is required. Among the factors that degrade HPR performance, we focus on rotation. To achieve rotation-invariant HPR, we propose a method that exploits the property that adjacent frames in video are highly correlated, considering the environment of HGR. The proposed method introduces the template update of object tracking based on this property, which differs from previous works based on still images. To compare our method with previous methods such as template matching, PCA, and LBP, we performed experiments on video containing hand rotation. The accuracy of the proposed method is 22.7%, 14.5%, 10.7%, and 4.3% higher than ordinary template matching, template matching using the KL-transform, PCA, and LBP, respectively.
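
The template-update idea can be sketched as follows: match the current frame against the running template, then blend the matched patch back into the template on confident matches so gradual hand rotation is absorbed. The match threshold and blend rate are illustrative assumptions.

```python
import cv2

def track_and_update(gray_frame, template, alpha=0.15):
    """gray_frame, template: 8-bit grayscale images."""
    res = cv2.matchTemplate(gray_frame, template, cv2.TM_CCOEFF_NORMED)
    _, score, _, (x, y) = cv2.minMaxLoc(res)
    h, w = template.shape
    matched = gray_frame[y:y + h, x:x + w]
    # Blend the matched patch into the template only on confident matches,
    # so the template follows gradual rotation without drifting on misses.
    if score > 0.6:
        template = cv2.addWeighted(template, 1 - alpha, matched, alpha, 0)
    return (x, y, w, h), template
```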