• Title/Summary/Keyword: Hand Tracking Technology


Research on Development of VR Realistic Sign Language Education Content Using Hand Tracking and Conversational AI (Hand Tracking과 대화형 AI를 활용한 VR 실감형 수어 교육 콘텐츠 개발 연구)

  • Jae-Sung Chun;Il-Young Moon
    • Journal of Advanced Navigation Technology
    • /
    • v.28 no.3
    • /
    • pp.369-374
    • /
    • 2024
  • This study aims to improve the accessibility and efficiency of sign language education for both hearing-impaired and hearing people. To this end, we developed VR realistic sign language education content that integrates hand tracking technology and conversational AI. Through this content, users can learn sign language in real time and experience direct communication in a virtual environment. The study confirmed that this integrated approach significantly improves immersion in sign language learning and helps lower the barriers to learning by giving learners a deeper understanding. This presents a new paradigm for sign language education and shows how technology can change the accessibility and effectiveness of education.

Real-Time Two Hands Tracking System

  • Liu, Nianjun;Lovell, Brian C.
    • Proceedings of the IEEK Conference
    • /
    • 2002.07c
    • /
    • pp.1491-1494
    • /
    • 2002
  • The paper introduces a novel real-time two-hand tracking system based on unrestricted hand skin segmentation using multiple color spaces. After color-based segmentation and pre-processing, a label set of regions is created to locate the two hands automatically. After normalization, template matching is used to determine which region is the left hand and which is the right. An improved fast self-adaptive tracking algorithm is then applied, and a Canny filter is used for hand detection.
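
A minimal OpenCV sketch of the pipeline this abstract describes (skin-color segmentation, region labelling, template matching to tell the left hand from the right). The HSV thresholds, the 0.5 matching score, and the left-hand template are illustrative assumptions, not values from the paper.

```python
# Sketch of the described pipeline: skin-colour segmentation, region
# labelling, then template matching to decide which blob is the left hand.
# Thresholds, the 0.5 score cut-off, and the template are illustrative only.
import cv2
import numpy as np

def find_hand_regions(frame_bgr, min_area=2000):
    """Return bounding boxes (x, y, w, h) of the two largest skin regions."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, (0, 40, 60), (25, 180, 255))   # rough skin range
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    n, _, stats, _ = cv2.connectedComponentsWithStats(mask)
    boxes = [tuple(stats[i, :4]) for i in range(1, n)
             if stats[i, cv2.CC_STAT_AREA] >= min_area]
    boxes.sort(key=lambda b: b[2] * b[3], reverse=True)
    return boxes[:2], mask

def label_left_right(mask, boxes, left_template):
    """Normalise each region to the template size and match it against a
    binary left-hand template (uint8) to label it left or right."""
    labelled = []
    for x, y, w, h in boxes:
        patch = cv2.resize(mask[y:y + h, x:x + w], left_template.shape[::-1])
        score = cv2.matchTemplate(patch, left_template, cv2.TM_CCOEFF_NORMED)[0, 0]
        labelled.append(("left" if score > 0.5 else "right", (x, y, w, h)))
    return labelled
```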


Hand Tracking and Hand Gesture Recognition for Human Computer Interaction

  • Bai, Yu;Park, Sang-Yun;Kim, Yun-Sik;Jeong, In-Gab;Ok, Soo-Yol;Lee, Eung-Joo
    • Journal of Korea Multimedia Society
    • /
    • v.14 no.2
    • /
    • pp.182-193
    • /
    • 2011
  • The aim of this paper is to present a methodology for hand tracking and hand gesture recognition. The detected hand and gestures can be used to implement a non-contact mouse; we developed an MP3 player controlled with this technology instead of a mouse. In this algorithm, we first apply pre-processing to every frame, including lighting compensation and background filtering, to reduce the adverse impact on the correctness of hand tracking and gesture recognition. Second, a YCbCr skin-color likelihood algorithm is used to detect the hand area. The Continuously Adaptive Mean Shift (CAMSHIFT) algorithm is then used to track the hand. Because the standard search window is square while the hand is closer to rectangular, we modified the search-window formula to obtain a window better suited to the hand. Finally, a Support Vector Machine (SVM) is used for hand gesture recognition. To train the system, we collected 1,500 pictures of 5 hand gestures. We performed extensive experiments on a Windows XP system to evaluate the efficiency of the proposed scheme: the hand tracking correct rate is 96%, and the average hand gesture correct rate is 95%.
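
A condensed sketch of the YCbCr skin-likelihood and CAMSHIFT tracking stages described above; the lighting compensation, search-window modification, and SVM classifier are omitted, and the Cb/Cr bounds are common textbook values rather than the paper's.

```python
# Sketch of YCbCr skin detection + CAMSHIFT tracking only; the paper's
# lighting compensation, window reshaping, and SVM classifier are omitted.
import cv2
import numpy as np

def skin_mask_ycbcr(frame_bgr):
    """Binary mask of likely skin pixels (Cr in [133, 173], Cb in [77, 127])."""
    ycrcb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YCrCb)
    return cv2.inRange(ycrcb, (0, 133, 77), (255, 173, 127))

def track_hand(video_path):
    cap = cv2.VideoCapture(video_path)
    ok, frame = cap.read()
    if not ok:
        return
    # Initial search window: bounding box of skin pixels in the first frame
    # (assumes a hand is visible there).
    pts = cv2.findNonZero(skin_mask_ycbcr(frame))
    window = cv2.boundingRect(pts)
    term = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        prob = skin_mask_ycbcr(frame)            # skin likelihood image
        # CAMSHIFT adapts the window's size and orientation to the blob.
        rot_box, window = cv2.CamShift(prob, window, term)
        cv2.ellipse(frame, rot_box, (0, 255, 0), 2)
        cv2.imshow("hand", frame)
        if cv2.waitKey(1) == 27:                 # Esc quits
            break
    cap.release()
    cv2.destroyAllWindows()
```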

A Long-Range Touch Interface for Interaction with Smart TVs

  • Lee, Jaeyeon;Kim, DoHyung;Kim, Jaehong;Cho, Jae-Il;Sohn, Joochan
    • ETRI Journal
    • /
    • v.34 no.6
    • /
    • pp.932-941
    • /
    • 2012
  • A powerful interaction mechanism is one of the key elements for the success of smart TVs, which demand far more complex interactions than traditional TVs. This paper proposes a novel interface based on the famous touch interaction model but utilizes long-range bare hand tracking to emulate touch actions. To satisfy the essential requirements of high accuracy and immediate response, the proposed hand tracking algorithm adopts a fast color-based tracker but with modifications to avoid the problems inherent to those algorithms. By using online modeling and motion information, the sensitivity to the environment can be greatly decreased. Furthermore, several ideas to solve the problems often encountered by users interacting with smart TVs are proposed, resulting in a very robust hand tracking algorithm that works superbly, even for users with sleeveless clothing. In addition, the proposed algorithm runs at a very high speed of 82.73 Hz. The proposed interface is confirmed to comfortably support most touch operations, such as clicks, swipes, and drags, at a distance of three meters, which makes the proposed interface a good candidate for interaction with smart TVs.
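
The combination of an online color model with motion information, as described above, can be sketched roughly as follows; the learning rate, histogram size, and motion threshold are assumptions, and the paper's full robustness measures are not reproduced.

```python
# Sketch of combining an online hue model with frame-difference motion so the
# tracker depends less on the environment; constants are assumptions.
import cv2
import numpy as np

ALPHA = 0.1          # learning rate of the online colour model (assumed)
MOTION_THRESH = 20   # frame-difference threshold in grey levels (assumed)

def update_hand_model(hist, frame_hsv, hand_box):
    """Blend the current hand-region hue histogram into the running model."""
    x, y, w, h = hand_box
    roi = frame_hsv[y:y + h, x:x + w]
    new_hist = cv2.calcHist([roi], [0], None, [32], [0, 180])
    cv2.normalize(new_hist, new_hist, 0, 255, cv2.NORM_MINMAX)
    blended = new_hist if hist is None else (1 - ALPHA) * hist + ALPHA * new_hist
    return blended.astype(np.float32)

def hand_likelihood(hist, frame_hsv, prev_gray, gray):
    """Back-project the colour model, gated by a motion mask."""
    backproj = cv2.calcBackProject([frame_hsv], [0], hist, [0, 180], 1)
    motion = (cv2.absdiff(gray, prev_gray) > MOTION_THRESH).astype(np.uint8)
    return backproj * motion     # high only where colour and motion agree
```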

Markerless System Using Hand Tracking (핸드 트레킹을 이용한 Markerless 시스템)

  • Ban, Kyeong-Jin;Kim, Jong-Chan;Kim, Kyeong-Og;Kim, Eung-Kon
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference
    • /
    • 2010.05a
    • /
    • pp.683-685
    • /
    • 2010
  • Augmented reality is a technology that overlays virtual objects on the real world as seen from the user's viewpoint. Studies are being conducted on effective communication methods using hand movements in augmented reality, since hand movements are an important means of human communication alongside writing and voice. Spatial arrangement and hand tracking technologies that rely on markers to acquire virtual information in augmented reality have many limitations. This paper proposes tracking the hand area to place the augmented object; the augmented object changes viewpoint according to the hand position, which improves the sense of depth and immersion.
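
One simple way to realize "the augmented object changes viewpoint with the hand position" is to map the tracked hand centroid to rotation angles of the virtual object; the sketch below is an assumed mapping, not the paper's implementation.

```python
# Assumed mapping from the tracked hand centroid to the virtual object's
# viewing angles; the rendering side is not shown.
MAX_YAW_DEG = 60.0    # rotation when the hand reaches the frame edge (assumed)
MAX_PITCH_DEG = 30.0

def hand_to_object_rotation(hand_cx, hand_cy, frame_w, frame_h):
    """Map a hand centroid in pixels to (yaw, pitch) for the augmented object."""
    dx = (hand_cx / frame_w) * 2.0 - 1.0   # -1 .. 1 across the frame width
    dy = (hand_cy / frame_h) * 2.0 - 1.0   # -1 .. 1 down the frame height
    return dx * MAX_YAW_DEG, dy * MAX_PITCH_DEG
```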


Enhanced Sign Language Transcription System via Hand Tracking and Pose Estimation

  • Kim, Jung-Ho;Kim, Najoung;Park, Hancheol;Park, Jong C.
    • Journal of Computing Science and Engineering
    • /
    • v.10 no.3
    • /
    • pp.95-101
    • /
    • 2016
  • In this study, we propose a new system for constructing parallel corpora for sign languages, which are generally under-resourced in comparison to spoken languages. In order to achieve scalability and accessibility regarding data collection and corpus construction, our system utilizes deep learning-based techniques and predicts depth information to perform pose estimation on hand information obtainable from video recordings by a single RGB camera. These estimated poses are then transcribed into expressions in SignWriting. We evaluate the accuracy of hand tracking and hand pose estimation modules of our system quantitatively, using the American Sign Language Image Dataset and the American Sign Language Lexicon Video Dataset. The evaluation results show that our transcription system has a high potential to be successfully employed in constructing a sizable sign language corpus using various types of video resources.
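
A highly schematic sketch of the transcription pipeline described above: 2D hand keypoints estimated from monocular RGB are lifted to 3D with a predicted depth and then mapped to transcription symbols. The keypoint estimator, depth model, and symbol mapping are stand-ins; the paper's deep networks and SignWriting rules are not reproduced here.

```python
# Schematic pipeline only: keypoint estimator, depth model, and symbol mapping
# are placeholders passed in by the caller, not the paper's networks.
from typing import Callable, List
import numpy as np

Keypoints2D = np.ndarray   # shape (21, 2): pixel coordinates of hand joints
Pose3D = np.ndarray        # shape (21, 3): keypoints with predicted depth

def lift_to_3d(kps: Keypoints2D,
               depth_model: Callable[[Keypoints2D], np.ndarray]) -> Pose3D:
    """Append a predicted depth value to each 2D keypoint."""
    z = depth_model(kps).reshape(-1, 1)      # (21, 1) predicted depths
    return np.hstack([kps, z])

def transcribe(frames: List[Keypoints2D],
               depth_model: Callable[[Keypoints2D], np.ndarray],
               to_symbol: Callable[[Pose3D], str]) -> List[str]:
    """Map a sequence of per-frame hand keypoints to transcription symbols."""
    return [to_symbol(lift_to_3d(k, depth_model)) for k in frames]
```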

Mobile Robot Control using Hand Shape Recognition (손 모양 인식을 이용한 모바일 로봇제어)

  • Kim, Young-Rae;Kim, Eun-Yi;Chang, Jae-Sik;Park, Se-Hyun
    • Journal of the Institute of Electronics Engineers of Korea CI
    • /
    • v.45 no.4
    • /
    • pp.34-40
    • /
    • 2008
  • This paper presents a vision-based walking robot control system using hand shape recognition. To recognize hand shapes, the hand boundary must be tracked accurately in images obtained from a moving camera. For this, we use an active contour model-based tracking approach with mean shift, which reduces the dependency of the active contour model on the location of the initial curve. The proposed system is composed of four modules: a hand detector, a hand tracker, a hand shape recognizer, and a robot controller. The hand detector detects a skin-color region with a specific shape as the hand in an image. Hand tracking is then performed using an active contour model with mean shift, and hand shape recognition is performed using Hu moments. To assess its validity, we applied the proposed system to a walking robot, RCB-1. The experimental results show the effectiveness of the proposed system.
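
The hand shape recognition stage above can be illustrated with Hu invariant moments of the hand contour compared against stored reference shapes; the nearest-neighbour rule and the command names below are assumptions, not the paper's classifier.

```python
# Sketch of hand-shape recognition with Hu moments and a nearest-neighbour
# match against stored reference shapes; commands and references are assumed.
import cv2
import numpy as np

def hu_features(mask):
    """Log-scaled Hu moments of the largest contour in a binary hand mask."""
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        raise ValueError("no hand region found in mask")
    hand = max(contours, key=cv2.contourArea)
    hu = cv2.HuMoments(cv2.moments(hand)).flatten()
    return -np.sign(hu) * np.log10(np.abs(hu) + 1e-12)   # compress the scale

def recognise(mask, references):
    """Return the command whose reference Hu vector is closest to the mask's."""
    feat = hu_features(mask)
    return min(references, key=lambda cmd: np.linalg.norm(references[cmd] - feat))

# references might look like {"forward": hu_open_palm, "stop": hu_fist, ...},
# with each value a 7-element Hu vector computed from a reference image.
```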

A Prototype Design for a Real-time VR Game with Hand Tracking Using Affordance Elements

  • Yu-Won Jeong
    • Journal of the Korea Society of Computer and Information
    • /
    • v.29 no.5
    • /
    • pp.47-53
    • /
    • 2024
  • In this paper, we propose applying interactive technology in virtual environments to enhance interaction and immersion by inducing more natural movements in the gesture recognition process through the concept of affordance. A technique is proposed to recognize gestures most similar to actual hand movements by applying a line-segment recognition algorithm that incorporates sampling and normalization steps. This line-segment recognition was applied to the drawing of magic circles in the <VR Spell> game implemented in this paper. The experiments verified the recognition rates for four line-segment recognition actions. This paper proposes a VR game that pursues greater immersion and fun through real-time hand tracking using affordance elements, applied to immersive content in virtual environments such as VR games.
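
The sampling and normalization steps of the line-segment gesture recognition described above can be sketched in the style of template-based stroke recognizers; the point count and the mean-distance matching rule are assumptions.

```python
# Resample each stroke to a fixed number of points, normalise translation and
# scale, then match by mean point distance; N_POINTS and the rule are assumed.
import numpy as np

N_POINTS = 32   # resampling resolution (assumed)

def resample(stroke, n=N_POINTS):
    """Resample a (k, 2) stroke to n points spaced evenly along its length."""
    seg = np.linalg.norm(np.diff(stroke, axis=0), axis=1)
    cum = np.concatenate([[0.0], np.cumsum(seg)])
    targets = np.linspace(0.0, cum[-1], n)
    return np.stack([np.interp(targets, cum, stroke[:, 0]),
                     np.interp(targets, cum, stroke[:, 1])], axis=1)

def normalise(stroke):
    """Translate the stroke to its centroid and scale it to a unit box."""
    centred = stroke - stroke.mean(axis=0)
    return centred / (np.ptp(centred, axis=0).max() + 1e-9)

def match(stroke, templates):
    """Return the template name with the smallest mean point distance."""
    s = normalise(resample(stroke))
    return min(templates, key=lambda name: np.linalg.norm(
        s - normalise(resample(templates[name])), axis=1).mean())
```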

A Study on Target Tracking Filter Architecture in Underwater Environment using Active and Passive Sensors (능, 수동센서를 이용한 수중환경에서의 표적추적필터 구조 연구)

  • Lim, Youngtaek;Suh, Taeil
    • Journal of the Korea Institute of Military Science and Technology
    • /
    • v.18 no.5
    • /
    • pp.517-524
    • /
    • 2015
  • In this paper, we propose a new target tracking filter architecture that uses active and passive sensors in an underwater environment. A passive sensor provides only a bearing measurement of the target, so a tracking filter based on a passive sensor alone suffers from an observability problem. An active sensor, on the other hand, does not have this problem because it provides both bearing and range measurements. We propose a tracking filter algorithm that can be used with the combined active and passive sensor system to handle maneuvering targets and to improve tracking performance. The proposed algorithm is tested in a series of computer simulation runs, and the results are analyzed and compared with an existing algorithm.
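
The difference between the passive (bearing-only) and active (bearing plus range) measurement models discussed above can be illustrated with a standard extended Kalman filter update; the state layout, noise handling, and own-ship-at-origin convention are assumptions, not the paper's filter architecture.

```python
# EKF measurement update with the own ship at the origin and a target state
# [x, y, vx, vy]; the passive sensor measures bearing only, the active sensor
# measures bearing and range. Noise matrices R are supplied by the caller.
import numpy as np

def h_passive(s):
    return np.array([np.arctan2(s[1], s[0])])                        # bearing

def h_active(s):
    return np.array([np.arctan2(s[1], s[0]), np.hypot(s[0], s[1])])  # bearing, range

def H_jacobian(s, active):
    x, y = s[0], s[1]
    r2 = x * x + y * y
    Hb = np.array([-y / r2, x / r2, 0.0, 0.0])                       # d(bearing)/ds
    if not active:
        return Hb[None, :]
    Hr = np.array([x / np.sqrt(r2), y / np.sqrt(r2), 0.0, 0.0])      # d(range)/ds
    return np.vstack([Hb, Hr])

def ekf_update(x, P, z, R, active):
    """One EKF measurement update; the prediction step is omitted."""
    h = h_active(x) if active else h_passive(x)
    H = H_jacobian(x, active)
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    innov = z - h
    innov[0] = (innov[0] + np.pi) % (2 * np.pi) - np.pi              # wrap bearing
    return x + K @ innov, (np.eye(len(x)) - K @ H) @ P
```

With bearing-only measurements the target range stays unobservable unless the observer maneuvers, which is the observability problem the abstract refers to; adding the active sensor's range measurement removes that limitation.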

Implementation of Gesture Interface for Projected Surfaces

  • Park, Yong-Suk;Park, Se-Ho;Kim, Tae-Gon;Chung, Jong-Moon
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.9 no.1
    • /
    • pp.378-390
    • /
    • 2015
  • Image projectors can turn any surface into a display. Integrating a surface projection with a user interface transforms it into an interactive display with many possible applications. Hand gesture interfaces are often used with projector-camera systems. Hand detection through color image processing is affected by the surrounding environment. The lack of illumination and color details greatly influences the detection process and drops the recognition success rate. In addition, there can be interference from the projection system itself due to image projection. In order to overcome these problems, a gesture interface based on depth images is proposed for projected surfaces. In this paper, a depth camera is used for hand recognition and for effectively extracting the area of the hand from the scene. A hand detection and finger tracking method based on depth images is proposed. Based on the proposed method, a touch interface for the projected surface is implemented and evaluated.
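
A sketch of a depth-image touch test in the spirit of the approach above: pixels within a band above the projection surface are segmented as the hand, a fingertip is taken as the contour point farthest from the palm centre, and a touch is reported when that point lies close to the surface. The overhead-camera assumption and all thresholds are illustrative, not the paper's values.

```python
# Depth-based touch test: segment pixels within a band above the projection
# surface, take the contour point farthest from the palm centre as the
# fingertip, and report a touch when it is close to the surface. Assumes the
# depth camera looks down at the surface; all thresholds are illustrative.
import cv2
import numpy as np

TOUCH_MM = 10       # fingertip height above the surface counted as a touch
HAND_BAND_MM = 150  # depth band above the surface searched for the hand

def detect_touch(depth_mm, surface_mm):
    """depth_mm, surface_mm: (H, W) uint16 scene and background depth in mm."""
    height = surface_mm.astype(np.int32) - depth_mm.astype(np.int32)
    hand_mask = ((height > 5) & (height < HAND_BAND_MM)).astype(np.uint8) * 255
    contours, _ = cv2.findContours(hand_mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    hand = max(contours, key=cv2.contourArea)
    m = cv2.moments(hand)
    if m["m00"] == 0:
        return None
    centre = np.array([m["m10"] / m["m00"], m["m01"] / m["m00"]])
    pts = hand.reshape(-1, 2).astype(np.float64)
    tx, ty = pts[np.argmax(np.linalg.norm(pts - centre, axis=1))].astype(int)
    return (tx, ty) if height[ty, tx] < TOUCH_MM else None   # touch point or None
```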