• Title/Abstract/Keyword: Hand Gesture

Search results: 403

Hand Gesture Sequence Recognition using Morphological Chain Code Edge Vector (형태론적 체인코드 에지벡터를 이용한 핸드 제스처 시퀀스 인식)

  • Lee Kang-Ho;Choi Jong-Ho
    • Journal of the Korea Society of Computer and Information
    • /
    • v.9 no.4 s.32
    • /
    • pp.85-91
    • /
    • 2004
  • The use of gestures provides an attractive alternative to cumbersome interface devices for human-computer interaction, which has motivated a very active research area concerned with computer vision-based analysis and interpretation of hand gestures. The most important issues in gesture recognition are the simplification of the algorithm and the reduction of processing time. Mathematical morphology, based on geometrical set theory, is well suited to performing this processing. The key idea of the proposed algorithm is to track the trajectory of the center points of primitive elements extracted by morphological shape decomposition. The trajectory of morphological center points carries information on the shape's orientation. Based on this characteristic, we propose a morphological gesture sequence recognition algorithm using feature vectors computed from the trajectory of morphological center points. Through experiments, we demonstrate the efficiency of the proposed algorithm.
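A minimal sketch of the general idea this abstract describes: centroids of successively eroded versions of a binary hand mask stand in for the primitive elements of morphological shape decomposition, and the centroid trajectory is encoded as an 8-direction chain code. The erosion-based decomposition, function names, and all parameter values are illustrative assumptions, not the paper's actual algorithm.

```python
# Illustrative sketch only: erosion levels stand in for the primitive
# elements of morphological shape decomposition, and the centroid
# trajectory is encoded as an 8-direction chain code.
import numpy as np
import cv2


def decompose_centroids(mask, num_levels=4, kernel_size=5):
    """Centroids of successively eroded versions of a binary hand mask."""
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (kernel_size, kernel_size))
    centroids = []
    current = mask.copy()
    for _ in range(num_levels):
        m = cv2.moments(current, binaryImage=True)
        if m["m00"] == 0:            # the shape has vanished; stop decomposing
            break
        centroids.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
        current = cv2.erode(current, kernel)
    return centroids


def chain_code(points):
    """Encode consecutive centroid displacements as 8-direction chain codes."""
    codes = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        angle = np.arctan2(y1 - y0, x1 - x0)        # orientation of this step
        codes.append(int(np.round(angle / (np.pi / 4))) % 8)
    return codes


if __name__ == "__main__":
    # Synthetic blob standing in for a segmented hand region: a thick
    # "palm" plus a thin extension that erodes away first, so the
    # centroid drifts and the chain code captures the drift direction.
    mask = np.zeros((200, 200), np.uint8)
    cv2.circle(mask, (60, 100), 40, 255, -1)
    cv2.rectangle(mask, (60, 90), (180, 110), 255, -1)
    pts = decompose_centroids(mask)
    print("centroid trajectory:", pts)
    print("chain code feature vector:", chain_code(pts))
```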


Virtual Fitting Development Based on Hand Gesture Recognition (손동작 인식 기반 Virtual Fitting 개발)

  • Kim, Seung-Yeon;Yu, Min-Ji;Jo, Ha-Jung;Jung, Seung-Won
    • Proceedings of the Korea Information Processing Society Conference
    • /
    • 2019.05a
    • /
    • pp.596-598
    • /
    • 2019
  • A virtual fitting system based on hand gesture recognition can realize natural fitting using a Kinect sensor. By implementing pose estimation, gesture recognition, and virtual fitting with the Kinect sensor, we introduce software for trying on clothing virtually.

Implementing Leap-Motion-Based Interface for Enhancing the Realism of Shooter Games (슈팅 게임의 현실감 개선을 위한 립모션 기반 인터페이스 구현)

  • Shin, Inho;Cheon, Donghun;Park, Hanhoon
    • Journal of the HCI Society of Korea
    • /
    • v.11 no.1
    • /
    • pp.5-10
    • /
    • 2016
  • This paper aims to provide a shooter game interface that enhances the game's realism by recognizing the user's hand gestures with the Leap Motion. We implemented the functions necessary in shooter games, such as shooting, moving, viewpoint change, and zoom in/out, and confirmed through a user test that a game interface using familiar and intuitive hand gestures is superior to the conventional mouse/keyboard in terms of ease of manipulation, interest, extensibility, and so on. Specifically, the user satisfaction score (on a 1-5 scale) was 3.02 on average with the mouse/keyboard interface and 3.57 on average with the proposed hand gesture interface.

A Prototype Design for a Real-time VR Game with Hand Tracking Using Affordance Elements

  • Yu-Won Jeong
    • Journal of the Korea Society of Computer and Information
    • /
    • v.29 no.5
    • /
    • pp.47-53
    • /
    • 2024
  • In this paper, we propose applying interactive technology in virtual environments to enhance interaction and immersion by inducing more natural movements in the gesture recognition process through the concept of affordance. A technique is proposed that recognizes the gestures most similar to actual hand movements by applying a line segment recognition algorithm with sampling and normalization steps in the gesture recognition process. This line segment recognition was applied to drawing magic circles in the <VR Spell> game implemented in this paper. The experiment verified the recognition rates of four line segment recognition actions. This paper aims to propose a VR game that pursues greater immersion and fun for the user through real-time hand tracking with affordance elements, applied to immersive content in virtual environments such as VR games.
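A hedged sketch of a sampling-and-normalization style stroke recognizer in the spirit of this abstract: the stroke is resampled to a fixed number of points, translated to its centroid, scaled into a unit box, and matched to templates by mean point distance. The template set, function names, and point counts are illustrative assumptions, not the game's implementation.

```python
# Illustrative stroke recognizer: resample, normalize, nearest-template match.
import numpy as np


def resample(points, n=32):
    """Resample a 2-D stroke to n points spaced evenly along its arc length."""
    pts = np.asarray(points, dtype=float)
    seg = np.linalg.norm(np.diff(pts, axis=0), axis=1)
    cumulative = np.concatenate([[0.0], np.cumsum(seg)])
    targets = np.linspace(0.0, cumulative[-1], n)
    return np.column_stack([
        np.interp(targets, cumulative, pts[:, 0]),
        np.interp(targets, cumulative, pts[:, 1]),
    ])


def normalize(pts):
    """Translate the stroke to its centroid and scale it into a unit box."""
    pts = pts - pts.mean(axis=0)
    extent = np.ptp(pts, axis=0).max()
    return pts / extent if extent > 0 else pts


def recognize(stroke, templates, n=32):
    """Return the template name with the smallest mean point-to-point distance."""
    query = normalize(resample(stroke, n))
    scores = {
        name: np.linalg.norm(normalize(resample(t, n)) - query, axis=1).mean()
        for name, t in templates.items()
    }
    return min(scores, key=scores.get), scores


if __name__ == "__main__":
    t = np.linspace(0, 2 * np.pi, 50)
    templates = {
        "circle":  np.column_stack([np.cos(t), np.sin(t)]),
        "line":    np.column_stack([t, np.zeros_like(t)]),
        "v_shape": np.column_stack([t, np.abs(t - np.pi)]),
    }
    # A scaled, noisy circle should still match the "circle" template.
    noisy_circle = np.column_stack([np.cos(t), np.sin(t)]) * 80 + np.random.normal(0, 2, (50, 2))
    best, scores = recognize(noisy_circle, templates)
    print("recognized:", best)
```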

A Study on Hand-signal Recognition System in 3-dimensional Space (3차원 공간상의 수신호 인식 시스템에 대한 연구)

  • 장효영;김대진;김정배;변증남
    • Journal of the Institute of Electronics Engineers of Korea CI
    • /
    • v.41 no.3
    • /
    • pp.103-114
    • /
    • 2004
  • This paper deals with a system capable of recognizing hand-signals in 3-dimensional space. The system uses two color cameras as input devices. A vision-based gesture recognition system is known to be user-friendly because of its contact-free characteristic, but as with other applications using a camera as an input device, there are difficulties under complex backgrounds and varying illumination. In order to robustly detect the hand region from an input image under various conditions without any special gloves or markers, the paper uses previous position information and an adaptive hand color model. The paper defines a hand-signal as a combination of two basic elements, 'hand pose' and 'hand trajectory'. As an extensible classification method for hand pose, the paper proposes a 2-stage classification method using a 'small group concept'. The paper also suggests a complementary feature selection method using the images from the two color cameras. We verified the method with a hand-signal application in our driving simulator.
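A rough sketch of the two ideas named in this abstract, an adaptive hand color model and the use of previous position information: a hue histogram is re-estimated each frame from the detected hand pixels, and the search is limited to a window around the last detection. The class name, blending rate, window size, and thresholds are assumptions for illustration, not the paper's method.

```python
# Illustrative adaptive hand tracker: hue-histogram back-projection
# restricted to a window around the previous hand position, with the
# color model blended toward the newly detected hand pixels each frame.
import numpy as np
import cv2


class AdaptiveHandTracker:
    def __init__(self, alpha=0.1, window=120):
        self.alpha = alpha          # blending rate for the color model update
        self.window = window        # half-size of the search window (pixels)
        self.hist = None            # current hue histogram of the hand
        self.last_pos = None        # (x, y) center of the previous detection

    def init_model(self, frame_bgr, hand_rect):
        """Initialize the color model from a known hand bounding box."""
        x, y, w, h = hand_rect
        hsv = cv2.cvtColor(frame_bgr[y:y + h, x:x + w], cv2.COLOR_BGR2HSV)
        self.hist = cv2.calcHist([hsv], [0], None, [32], [0, 180])
        cv2.normalize(self.hist, self.hist, 0, 255, cv2.NORM_MINMAX)
        self.last_pos = (x + w // 2, y + h // 2)

    def track(self, frame_bgr):
        """Return the new hand center, or None if the hand is lost."""
        hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
        backproj = cv2.calcBackProject([hsv], [0], self.hist, [0, 180], 1)
        # Keep only the region around the previous position.
        roi = np.zeros(backproj.shape, np.uint8)
        cx, cy = self.last_pos
        cv2.rectangle(roi, (cx - self.window, cy - self.window),
                      (cx + self.window, cy + self.window), 255, -1)
        backproj = cv2.bitwise_and(backproj, roi)
        _, hand_mask = cv2.threshold(backproj, 50, 255, cv2.THRESH_BINARY)
        m = cv2.moments(hand_mask, binaryImage=True)
        if m["m00"] == 0:
            return None                         # hand lost; keep the old model
        self.last_pos = (int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"]))
        # Adapt the color model using the newly detected hand pixels.
        new_hist = cv2.calcHist([hsv], [0], hand_mask, [32], [0, 180])
        cv2.normalize(new_hist, new_hist, 0, 255, cv2.NORM_MINMAX)
        self.hist = (1 - self.alpha) * self.hist + self.alpha * new_hist
        return self.last_pos
```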

Visual Touchless User Interface for Window Manipulation (윈도우 제어를 위한 시각적 비접촉 사용자 인터페이스)

  • Kim, Jin-Woo;Jung, Kyung-Boo;Jeong, Seung-Do;Choi, Byung-Uk
    • Journal of KIISE:Software and Applications
    • /
    • v.36 no.6
    • /
    • pp.471-478
    • /
    • 2009
  • Recently, research on user interfaces has advanced remarkably due to the explosive growth of 3-dimensional contents and applications and the widening range of computer users. This paper proposes a novel method for manipulating windows efficiently using only intuitive hand motions. Previous methods have drawbacks such as the burden of expensive devices, the high complexity of gesture recognition, and the need for additional information such as markers. To overcome these defects, we propose a novel visual touchless interface. First, we detect the hand region using the hue channel of the HSV color space. The distance transform is then applied to find the centroid of the hand, and the curvature of the hand contour is used to determine the positions of the fingertips. Finally, using the hand motion information, we recognize the hand gesture as one of seven predefined motions, and the recognized gesture becomes a command to control a window. In the proposed method, the user can manipulate windows with a sense of depth in the real environment because the method adopts a stereo camera. Intuitive manipulation is also possible because the proposed method supports visual touch of the virtual object the user wants to manipulate, using only simple hand motions. Finally, the efficiency of the proposed method is verified via an application based on the proposed interface.
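A minimal sketch of the pipeline this abstract outlines: hue-based segmentation in HSV space, the distance transform for the palm center, and a simple turning-angle test on the contour for fingertip candidates. All thresholds, ranges, and helper names are illustrative assumptions rather than the paper's values.

```python
# Illustrative hand pipeline: hue segmentation, distance-transform palm
# center, and contour turning-angle test for fingertip candidates.
import numpy as np
import cv2


def segment_hand(frame_bgr, hue_range=(0, 20)):
    """Binary hand mask from the hue channel of the HSV color space."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, (hue_range[0], 40, 60), (hue_range[1], 255, 255))
    return cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))


def palm_center(mask):
    """Point farthest from the background, found with the distance transform."""
    dist = cv2.distanceTransform(mask, cv2.DIST_L2, 5)
    _, _, _, max_loc = cv2.minMaxLoc(dist)
    return max_loc


def fingertip_candidates(mask, k=15, angle_thresh_deg=60):
    """Contour points whose local turning angle is sharp (fingertip-like)."""
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    if not contours:
        return []
    contour = max(contours, key=cv2.contourArea).reshape(-1, 2).astype(float)
    tips, n = [], len(contour)
    for i in range(n):
        p_prev, p, p_next = contour[i - k], contour[i], contour[(i + k) % n]
        v1, v2 = p_prev - p, p_next - p
        cos_angle = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2) + 1e-9)
        angle = np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))
        if angle < angle_thresh_deg:          # sharp turn -> fingertip candidate
            tips.append(tuple(p.astype(int)))
    return tips
```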

Hand Motion Recognition Algorithm Using Skin Color and Center of Gravity Profile (피부색과 무게중심 프로필을 이용한 손동작 인식 알고리즘)

  • Park, Youngmin
    • The Journal of the Convergence on Culture Technology
    • /
    • v.7 no.2
    • /
    • pp.411-417
    • /
    • 2021
  • The field that studies how humans and computers communicate with each other and recognize information is called HCI (human-computer interaction). This study addresses hand gesture recognition for human-computer interaction: it examines the problems of existing recognition methods and proposes an algorithm to improve the recognition rate. The hand region is extracted from an image containing the shape of a human hand based on skin color information, and the center of gravity profile is calculated using principal component analysis. We propose a method to increase the recognition rate of hand gestures by comparing the obtained information with predefined shapes. The existing center of gravity profile yields incorrect hand gesture recognition when the hand is deformed by rotation; in this study, the profile is re-derived so that the contour point farthest from the center of gravity is taken as the starting point, producing a more robust algorithm. No gloves or markers attached to a sensor are used for hand gesture recognition, and no separate blue screen is installed. To resolve misrecognition, the feature vector at the nearest distance is found and an appropriate threshold is obtained to distinguish between success and failure.
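A hedged sketch of a center of gravity profile made rotation-tolerant by starting at the contour point farthest from the centroid, as this abstract describes, with nearest-distance matching and a reject threshold. The PCA step is omitted, and all names and threshold values are illustrative.

```python
# Illustrative center-of-gravity profile with a rotation-tolerant start
# point and nearest-shape matching under a reject threshold.
import numpy as np
import cv2


def cog_profile(mask, n_samples=64):
    """Centroid-to-contour distances, resampled to n_samples, rolled so the
    profile starts at the contour point farthest from the centroid."""
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    contour = max(contours, key=cv2.contourArea).reshape(-1, 2).astype(float)
    centroid = contour.mean(axis=0)
    dist = np.linalg.norm(contour - centroid, axis=1)
    dist = np.roll(dist, -int(np.argmax(dist)))       # start at the farthest point
    # Resample to a fixed length and normalize the scale.
    idx = np.linspace(0, len(dist) - 1, n_samples)
    profile = np.interp(idx, np.arange(len(dist)), dist)
    return profile / (profile.max() + 1e-9)


def classify(mask, reference_profiles, reject_threshold=0.15):
    """Nearest predefined shape, or None when even the best match is too far."""
    p = cog_profile(mask)
    distances = {name: np.linalg.norm(p - ref) / np.sqrt(len(p))
                 for name, ref in reference_profiles.items()}
    best = min(distances, key=distances.get)
    return best if distances[best] < reject_threshold else None
```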

Design of Hand Gestures for Smart Home Appliances based on a User Centered Approach (스마트홈 내 기기와의 상호작용을 위한 사용자 중심의 핸드 제스처 도출)

  • Choi, Eun-Jung;Kwon, Sung-Hyuk;Lee, Dong-Hun;Lee, Ho-Jin;Chung, Min-K.
    • Journal of Korean Institute of Industrial Engineers
    • /
    • v.38 no.3
    • /
    • pp.182-190
    • /
    • 2012
  • With the progress of both wired and wireless home networking technology, various projects on smart homes have been carried out around the world (Harper, 2003), and at the same time, new approaches to interacting with smart home systems efficiently and effectively have been investigated. A gesture-based interface is one of these approaches. Especially with the advance of gesture recognition technologies, a variety of research studies on gesture interaction with the functions of IT devices have been conducted. However, few studies have suggested and investigated the use of gestures for controlling smart home appliances. In this research, gestures for selected smart home appliances are suggested based on a user-centered approach. A total of thirty-eight functions were selected, and thirty participants generated gestures for each function. Following Nielsen (2004), Lee et al. (2010), and Kuhnel et al. (2011), the gesture with the highest frequency for each function (the top gesture) is suggested and investigated.
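A small sketch of the frequency-based selection step this abstract mentions: for each appliance function, the gestures proposed by participants are tallied and the most frequent one is kept as the top gesture. The example data and function names are invented for illustration.

```python
# Illustrative tally of elicited gestures: keep the most frequent gesture
# per function, along with the share of participants who proposed it.
from collections import Counter


def top_gestures(elicited):
    """elicited maps a function name to the list of gestures participants proposed."""
    result = {}
    for function, gestures in elicited.items():
        gesture, freq = Counter(gestures).most_common(1)[0]
        result[function] = (gesture, freq / len(gestures))
    return result


if __name__ == "__main__":
    elicited = {
        "TV power on": ["push palm", "push palm", "snap", "push palm", "wave"],
        "volume up":   ["raise hand", "raise hand", "thumb up", "raise hand", "raise hand"],
    }
    for function, (gesture, share) in top_gestures(elicited).items():
        print(f"{function}: {gesture} ({share:.0%} of participants)")
```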

A Comparison of the Characteristics between Single and Double Finger Gestures for Web Browsers

  • Park, Jae-Kyu;Lim, Young-Jae;Jung, Eui-S.
    • Journal of the Ergonomics Society of Korea
    • /
    • v.31 no.5
    • /
    • pp.629-636
    • /
    • 2012
  • Objective: The purpose of this study is to compare the characteristics of single and double finger gestures for web browsers and to extract appropriate finger gestures. Background: As electronic equipment emphasizes miniaturization to improve portability, various interfaces are being developed as input devices. As electronic devices become smaller, gesture recognition technology using touch-based interfaces is favored for easy editing. In addition, users focus primarily on the simplicity of intuitive interfaces, which propels further research on gesture-based interfaces. In particular, finger gestures in these intuitive interfaces are simple, fast, and user-friendly. Recently, single and double finger gestures have become more popular, so more applications for these gestures are being developed. However, systems and software that employ such finger gestures lack consistency, and clear standards and guidelines are still missing. Method: To learn how these gestures are applied, we performed the sketch map method, a method for memory elicitation. In addition, we used the MIMA (Meaning in Mediated Action) method to evaluate the gesture interface. Results: This study created gestures suitable for intuitive judgment. We conducted a usability test covering single and double finger gestures. The results showed that double finger gestures had shorter performance times than single finger gestures. Single finger gestures showed a wide satisfaction gap between similar and different types: they can be judged intuitively for a similar type, but it is difficult to associate functions for a different type. Conclusion: This study found that double finger gestures were effective for associating functions for web navigation. In particular, double finger gestures could be effective for associating complex forms such as curve-shaped gestures. Application: This study aims to facilitate the design of products that utilize finger and hand gestures.

A Study of Pattern-based Gesture Interaction in Tabletop Environments (테이블탑 환경에서 패턴 기반의 제스처 인터렉션 방법 연구)

  • Kim, Gun-Hee;Cho, Hyun-Chul;Pei, Wen-Hua;Ha, Sung-Do;Park, Ji-Hyung
    • Proceedings of the HCI Society of Korea Conference
    • /
    • 2009.02a
    • /
    • pp.696-700
    • /
    • 2009
  • In this paper, we present a framework that enables users to interact naturally with hand gestures on a digital table. In general tabletop applications, one gesture is mapped to one function or command, so users must know these mappings and make predefined gestures as input. In contrast, in our system users can make gesture input without this cognitive load. Instead of burdening users, the system possesses knowledge about gesture interaction and proactively infers users' gestures and intentions. When a user makes a gesture on the digital surface, the system analyzes the gesture and designs the response according to the user's intention.
