• Title/Summary/Keyword: hands tracking

Search Result 53

Vision and Depth Information based Real-time Hand Interface Method Using Finger Joint Estimation (손가락 마디 추정을 이용한 비전 및 깊이 정보 기반 손 인터페이스 방법)

  • Park, Kiseo;Lee, Daeho;Park, Youngtae
    • Journal of Digital Convergence / v.11 no.7 / pp.157-163 / 2013
  • In this paper, we propose a vision and depth information based real-time hand gesture interface method using finger joint estimation. For this, the areas of the left and right hands are segmented after mapping the visual image onto the depth image, and labeling and boundary noise removal are performed. Then, the centroid point and rotation angle of each hand area are calculated. Afterwards, a circle is expanded stepwise from the centroid of the hand, and the joint points and end points of the fingers are detected by taking the midpoints of the segments where the circle crosses the hand boundary; the hand model is then recognized. Experimental results show that our method distinguishes fingertips and recognizes various hand gestures quickly and accurately. In experiments on various hand poses with hidden fingers, using both hands, the accuracy exceeded 90% and the performance exceeded 25 fps. The proposed method can be used as a contactless input interface in HCI control, education, and game applications.
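The circle-expansion step above can be sketched roughly as follows. This is a minimal NumPy illustration assuming a binary hand mask as input; the function names, sampling density, and midpoint rule are our own simplifications, not taken from the paper:

```python
import numpy as np

def hand_centroid(mask):
    """Centroid (row, col) of a binary hand mask."""
    ys, xs = np.nonzero(mask)
    return ys.mean(), xs.mean()

def circle_crossings(mask, center, radius, n_samples=360):
    """Angles (radians) at which a circle of the given radius, centred
    on the hand centroid, crosses the hand boundary (i.e. where mask
    membership toggles along the sampled circle)."""
    cy, cx = center
    theta = np.linspace(0.0, 2 * np.pi, n_samples, endpoint=False)
    ys = np.clip(np.round(cy + radius * np.sin(theta)).astype(int), 0, mask.shape[0] - 1)
    xs = np.clip(np.round(cx + radius * np.cos(theta)).astype(int), 0, mask.shape[1] - 1)
    inside = mask[ys, xs]
    toggles = np.nonzero(inside != np.roll(inside, 1))[0]
    return theta[toggles]

def section_midpoints(angles):
    """Midpoints of consecutive crossing pairs: where the circle cuts
    through a finger, the midpoint approximates the direction of a
    joint point or end point."""
    if len(angles) < 2:
        return np.array([])
    if len(angles) % 2:
        angles = angles[:-1]
    return angles.reshape(-1, 2).mean(axis=1)
```

Expanding the radius and repeating `circle_crossings` yields a sequence of midpoints per finger, which is the raw material for the joint/end-point estimation described above.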

Behaviors of hand washing practice Korean adolescents, 2011-2013: The Korea Youth Risk Behavior Web-based Survey (청소년의 손 씻기 실천 행태 분석; 청소년 건강행태 온라인 조사 2011-2013년도를 중심으로)

  • Choi, Young-Sil
    • Journal of the Korea Academia-Industrial cooperation Society / v.15 no.7 / pp.4132-4138 / 2014
  • The purpose of this assessment was to provide basic data for setting up 'hand washing' education as part of health planning and education programs for adolescents. The study analyzed the hand-washing behavior of students ranging from middle school to high school, using the Korea Youth Risk Behavior Web-based Survey data for the three most recent years (2011, 2012, and 2013); the SPSS 18.0 statistical program, frequency tests, and cross-analysis were used for data analysis. In the data, the response "Never washed" before having a meal accounted for 29.4% of students in 2011, 30.5% in 2012, and 18.5% in 2013. Unlike the other items, these facts suggest that this behavior should be considered significant in the assessment. Tracking the trend over the three years confirmed that students living in metropolitan and medium-sized cities were less likely to wash their hands than students in small-sized towns. In terms of gender, female students were less likely to wash their hands than male students. Regarding school type, more students in public middle and high schools responded "Never hand wash" than students in special-purpose high schools. Furthermore, as grade increased in middle school and high school, students were less likely to wash their hands before meals at school. Therefore, health promotion and health education for students should be conducted more carefully, with more emphasis on this point.

Robust Head Tracking using a Hybrid of Omega Shape Tracker and Face Detector for Robot Photographer (로봇 사진사를 위한 오메가 형상 추적기와 얼굴 검출기 융합을 이용한 강인한 머리 추적)

  • Kim, Ji-Sung;Joung, Ji-Hoon;Ho, An-Kwang;Ryu, Yeon-Geol;Lee, Won-Hyung;Jin, Chung-Myung
    • The Journal of Korea Robotics Society / v.5 no.2 / pp.152-159 / 2010
  • Finding the head of a person in a scene is very important for a robot photographer, because a well composed picture depends on the position of the head. In this paper, we propose a robust head tracking algorithm that hybridizes an omega shape tracker and a local binary pattern (LBP) AdaBoost face detector, so that the robot photographer can take a fine picture automatically. Face detection algorithms perform well on frontal faces, but not on rotated faces; in addition, when the face is occluded by a hat or hands, they have a hard time finding the face. To solve this problem, an omega shape tracker based on the active shape model (ASM) is presented. The omega shape tracker is robust to occlusion and illumination change. However, when the environment is dynamic, such as when people move fast or the background is complex, its performance is unsatisfactory. Therefore, this paper proposes a method that combines the face detector and the omega shape tracker probabilistically, using histogram of oriented gradients (HOG) descriptors, in order to robustly find the human head. A robot photographer was also implemented to abide by the 'rule of thirds' and to take photos when people smile.
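The probabilistic combination of the two hypotheses might be sketched as follows. This is a simplified stand-in: it assumes each source provides a head-box centre plus a histogram-similarity confidence, and it uses the Bhattacharyya coefficient (a common choice for comparing HOG or colour histograms) rather than necessarily the paper's exact measure:

```python
import numpy as np

def bhattacharyya(h1, h2):
    """Bhattacharyya coefficient between two histograms (1.0 = identical)."""
    h1 = np.asarray(h1, float) / np.sum(h1)
    h2 = np.asarray(h2, float) / np.sum(h2)
    return float(np.sum(np.sqrt(h1 * h2)))

def fuse_head_hypotheses(det_center, trk_center, w_det, w_trk):
    """Fuse the face-detector and omega-tracker head-box centres,
    weighting each hypothesis by its confidence (e.g. histogram
    similarity to a head appearance model)."""
    w = np.array([w_det, w_trk], float)
    w /= w.sum()
    centers = np.array([det_center, trk_center], float)
    return tuple(w @ centers)
```

When the detector fails (rotated or occluded face), its weight drops toward zero and the tracker dominates, which is the qualitative behaviour the hybrid aims for.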

B-COV:Bio-inspired Virtual Interaction for 3D Articulated Robotic Arm for Post-stroke Rehabilitation during Pandemic of COVID-19

  • Allehaibi, Khalid Hamid Salman;Basori, Ahmad Hoirul;Albaqami, Nasser Nammas
    • International Journal of Computer Science & Network Security / v.21 no.2 / pp.110-119 / 2021
  • The Coronavirus, or COVID-19, is a contagious virus that has infected almost every part of the world. The pandemic forced major countries into lockdowns and stay-at-home policies to reduce the spread of the virus and the number of victims. Interaction between humans and robots is a popular subject of research worldwide. In medical robotics, the primary challenge is to implement natural interaction between robots and human users. Human communication consists of dynamic processes that involve joint attention and mutual engagement; coordinated care involves agents sharing behaviours, events, interests, and contexts over time. Because a robotic arm is an expensive and complicated system, robot simulators are widely used instead for rehabilitation purposes in medicine, and natural interaction is necessary for disabled persons to work with such a simulator. This article proposes a low-cost rehabilitation system built around a depth-camera-based arm gesture tracking system that captures and interprets human gestures and uses them as interactive commands for a robot simulator to perform specific tasks on a 3D block. The results show that the proposed system can help patients control the rotation and movement of the 3D arm using their hands. Pilot testing with healthy subjects yielded encouraging results: they could synchronize their actions with the 3D robotic arm to perform several repetitive tasks, exerting 19920 J of energy (kg·m²·s⁻²). This average consumed energy is in the medium range; we therefore relate it to rehabilitation performance as an initial stage, which can be improved further with extra repetitive exercise to speed up the recovery process.
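The gesture-to-command step could look roughly like this minimal sketch: a tracked hand position is mapped linearly, with clamping, onto rotation commands for the simulated arm. The coordinate ranges and joint limits below are invented for illustration and are not taken from the paper:

```python
def _lin(v, src, dst):
    """Linearly map v from the src interval to the dst interval, clamped."""
    t = (v - src[0]) / (src[1] - src[0])
    t = min(max(t, 0.0), 1.0)
    return dst[0] + t * (dst[1] - dst[0])

def hand_to_arm_command(hand_x, hand_y):
    """Map a tracked hand position (metres, camera frame) to yaw/pitch
    rotation commands (degrees) for a simulated 3D arm.
    The ranges (+/-0.3 m, +/-0.2 m) and limits are illustrative only."""
    yaw = _lin(hand_x, (-0.3, 0.3), (-90.0, 90.0))
    pitch = _lin(hand_y, (-0.2, 0.2), (0.0, 60.0))
    return yaw, pitch
```

Clamping keeps the simulator within safe joint limits even when the depth camera reports a hand position outside the calibrated workspace.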

Implementation of a Mobile App for Companion Dog Training using AR and Hand Tracking (AR 및 Hand Tracking을 활용한 반려견 훈련 모바일 앱 구현)

  • Chul-Ho Choi;Sung-Wook Park;Se-Hoon Jung;Chun-Bo Sim
    • The Journal of the Korea institute of electronic communication sciences / v.18 no.5 / pp.927-934 / 2023
  • With the recent growth of the companion animal market, various social issues related to companion animals have also come to the forefront. Notable problems include incidents of dog bites, the challenge of managing abandoned companion animals, euthanasia, animal abuse, and more. As potential solutions, a variety of training programs such as companion animal-focused broadcasts and educational apps are being offered. However, these options might not be very effective for novice caretakers who are uncertain about what to prioritize in training. While training apps that are relatively easy to access have been widely distributed, apps that allow users to directly engage in training and learn through hands-on experience are still insufficient. In this paper, we propose a more efficient AR-based mobile app for companion animal training, utilizing the Unity engine. The results of usability evaluations indicated increased user engagement due to the inclusion of elements that were previously absent. Moreover, training immersion was enhanced, leading to improved learning outcomes. With further development and subsequent verification and production, we anticipate that this app could become an effective training tool for novice caretakers planning to adopt companion animals, as well as for experienced caretakers.

Variability of Practice Effects in Transfer of Photoelectric Rotary Pursuit Task

  • Jeon, Hye-Seon
    • Physical Therapy Korea / v.12 no.4 / pp.7-11 / 2005
  • The purposes of this study were to investigate the effects of variability of training on the acquisition of motor skill of closed loop type tracking task using Rotary Pursuit, and to determine if there was a bilateral transfer effect to the non-dominant hand following practice with the dominant hand. Twelve healthy volunteer students (5 males and 7 females, aged 25 to 37) were randomly divided into a constant practice group and a variable practice group. A photoelectric rotary pursuit apparatus with stop clock and repeat cycle timer by Lafayette Instrumentation Co. was used for this study. Rotary pursuit is a closed loop task in which a subject attempts to keep a photoelectric stylus on a lighted target in motion. Subjects performed the clockwise circular pursuit task while standing. Experimental procedure was divided into three sessions, namely, pre-test, training, and post-test. The constant group practiced all 60 trials at 30 rpm. Variable practice group did a varied practice session with 15 trials at speeds of 20 rpm, 26 rpm, 34 rpm, and 46 rpm. No one in either group practiced with their non-dominant arm. A Mann-Whitney test and a Wilcoxon Signed Ranks test were used for statistical analyses. The results of this study showed no different training effect between groups on the post-test with the dominant hand. However, bilateral transfer effect of rotary pursuit task between hands was demonstrated. Possible mechanisms are discussed.


Tangible AR Interaction based on Fingertip Touch Using Small-Sized Markers (소형 마커를 이용한 손가락 터치 기반 감각형 증강현실 상호작용 방안)

  • Jung, Ho-Kyun;Park, Hyungjun
    • Korean Journal of Computational Design and Engineering / v.18 no.5 / pp.374-383 / 2013
  • Various interaction techniques have been studied to provide the feeling of touch and improve immersion in augmented reality (AR) environments. Tangible AR interaction exploiting two types of simple objects (product-type and pointer-type) has earned great interest for cost-effective design evaluation of digital handheld products. When the markers attached to the objects are kept large to obtain better marker recognition, the pointer-type object frequently and significantly occludes the product-type object, which deteriorates natural visualization and the level of immersion in the AR environment. In this paper, to overcome these problems, we propose tangible AR interaction using fingertip touch combined with small-sized markers. The proposed approach uses convex polygons to recover the boundaries of AR markers that are partially occluded. It also enlarges the pattern area of each AR marker appropriately, reducing marker size without sacrificing the quality of marker detection. We empirically verified the quality of the proposed approach and applied it in the design evaluation of digital products. Experimental results show that the approach is accurate enough to be applied to the design evaluation process and tangible enough to provide a pseudo feeling of manipulating virtual products with human hands.
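Boundary recovery with convex polygons can be illustrated with a standard convex hull routine (Andrew's monotone chain): points on the visible part of a marker's boundary still determine the marker's corner quad as long as the corners themselves are seen. The paper's actual recovery procedure is more involved, so treat this as a sketch of the underlying idea:

```python
def convex_hull(points):
    """Andrew's monotone chain convex hull. Returns hull vertices in
    counter-clockwise order; collinear boundary points are dropped."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    lower, upper = [], []
    for p in pts:                      # build lower hull
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):            # build upper hull
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]
```

Given boundary samples from the unoccluded part of a square marker (corners plus edge points), the hull collapses back to the four corners, from which the full marker quad can be reconstructed.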

NATURAL INTERACTION WITH VIRTUAL PET ON YOUR PALM

  • Choi, Jun-Yeong;Han, Jae-Hyek;Seo, Byung-Kuk;Park, Han-Hoon;Park, Jong-Il
    • Proceedings of the Korean Society of Broadcast Engineers Conference / 2009.01a / pp.341-345 / 2009
  • We present an augmented reality (AR) application for cell phones in which users put a virtual pet on their palm and play/interact with the pet by moving their hands and fingers naturally. The application is fundamentally based on hand/palm pose recognition and finger motion estimation, which are the main concerns of this paper. We propose a fast and efficient hand/palm pose recognition method that uses natural features extracted from a hand image (e.g. direction, width, and contour shape of the hand region) together with prior knowledge of hand shape and geometry (e.g. its approximate shape when the palm is open, and the length ratio between palm width and palm height). We also propose a natural interaction method that recognizes natural finger motions, such as opening and closing the palm, based on fingertip tracking. Based on the proposed methods, we developed and tested the AR application on an ultra-mobile PC (UMPC).
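Opening/closing-palm recognition from fingertip tracking might be sketched as follows, assuming a radial-distance profile of the hand contour (distance from the palm centre, normalised by palm radius) as input; the threshold and peak rule are illustrative assumptions, not the paper's method:

```python
def count_fingertips(radial, thresh=1.3):
    """Count fingertip peaks in a circular radial-distance profile of
    the hand contour. A local maximum above `thresh` palm radii is
    treated as an extended finger."""
    n = len(radial)
    peaks = 0
    for i in range(n):
        prev_r = radial[i - 1]
        next_r = radial[(i + 1) % n]
        if radial[i] > thresh and radial[i] >= prev_r and radial[i] > next_r:
            peaks += 1
    return peaks

def palm_state(radial):
    """'open' if several fingers are extended, else 'closed'."""
    return 'open' if count_fingertips(radial) >= 4 else 'closed'
```

An open palm produces several peaks well outside the palm radius; a fist flattens the profile, so the same rule reports 'closed'.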


Vision-Based Two-Arm Gesture Recognition by Using Longest Common Subsequence (최대 공통 부열을 이용한 비전 기반의 양팔 제스처 인식)

  • Choi, Cheol-Min;Ahn, Jung-Ho;Byun, Hye-Ran
    • The Journal of Korean Institute of Communications and Information Sciences / v.33 no.5C / pp.371-377 / 2008
  • In this paper, we present a framework for vision-based two-arm gesture recognition. To capture the motion information of the hands, we perform a color-based tracking algorithm with an adaptive kernel for each frame, and a feature selection algorithm classifies the motion information into four different phases. Using the gesture phase information, we build a gesture model that consists of symbol probabilities and a symbol sequence learned from the longest common subsequence. Finally, we present a similarity measure for two-arm gesture recognition using the proposed gesture models. The experimental results show the efficiency of the proposed feature selection method and the simplicity and robustness of the recognition algorithm.
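The longest-common-subsequence computation at the core of the gesture model is the standard dynamic program; the normalisation below is one plausible way to turn it into a similarity score, not necessarily the paper's exact measure:

```python
def lcs_length(a, b):
    """Length of the longest common subsequence of two symbol sequences."""
    m, n = len(a), len(b)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m):
        for j in range(n):
            if a[i] == b[j]:
                dp[i + 1][j + 1] = dp[i][j] + 1
            else:
                dp[i + 1][j + 1] = max(dp[i][j + 1], dp[i + 1][j])
    return dp[m][n]

def lcs_similarity(a, b):
    """Normalised LCS similarity in [0, 1] between two symbol sequences."""
    if not a or not b:
        return 0.0
    return lcs_length(a, b) / max(len(a), len(b))
```

Because a subsequence need not be contiguous, the measure tolerates insertions and temporal jitter in the observed symbol stream, which is what makes LCS attractive for matching noisy gesture sequences.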

A study imitating human auditory system for tracking the position of sound source (인간의 청각 시스템을 응용한 음원위치 추정에 관한 연구)

  • Bae, Jeen-Man;Cho, Sun-Ho;Park, Chong-Kuk
    • Proceedings of the KIEE Conference / 2003.11c / pp.878-881 / 2003
  • To acquire a clear voice signal from a designated speaker in surveillance-camera, video-conference, or hands-free microphone settings, interference noise must be eliminated, which first requires estimating the speaker's position automatically. The basic algorithm for sound source localization measures the TDOA (Time Difference Of Arrival) of the same signal at two microphones. This work uses ADF (Adaptive Delay Filter) [4] and CPS (Cross Power Spectrum) [5], two of the most important TDOA analyses, and proposes NI-ADF, an improved model that enables real-time estimation of the sound source position in both directions. NI-ADF is motivated by the observation that the human auditory system accepts a sound through activated nerves when it reaches a certain level in a certain frequency band. When the microphones are mounted on a system, NI-ADF also exploits the level difference of sounds arriving from outside the system, caused by diffraction, to make real-time localization in both directions practicable. Whereas the existing bidirectional adaptive filter algorithm more than doubles the computation of one-way measurement when localizing a sound source, the proposed improved algorithm overcomes this weakness and performs the estimation in real time in both directions.
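The underlying TDOA idea can be illustrated with a plain cross-correlation sketch (the paper's ADF/CPS machinery is more sophisticated); the microphone spacing and speed of sound below are illustrative values:

```python
import numpy as np

def tdoa_samples(sig_left, sig_right):
    """Estimate how many samples the right channel lags the left
    (positive: the sound reached the left microphone first), via the
    peak of the full cross-correlation."""
    corr = np.correlate(sig_right, sig_left, mode='full')
    return int(np.argmax(corr)) - (len(sig_left) - 1)

def doa_angle(tdoa_seconds, mic_dist=0.2, c=343.0):
    """Far-field direction-of-arrival angle (radians from broadside)
    for a two-microphone array with spacing mic_dist (metres)."""
    s = np.clip(tdoa_seconds * c / mic_dist, -1.0, 1.0)
    return float(np.arcsin(s))
```

Dividing the sample delay by the sampling rate gives the TDOA in seconds, from which `doa_angle` recovers the bearing of the source under the far-field assumption.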
