• Title/Summary/Keyword: Hand Motion


Hand Gesture Recognition with Convolution Neural Networks for Augmented Reality Cognitive Rehabilitation System Based on Leap Motion Controller (립모션 센서 기반 증강현실 인지재활 훈련시스템을 위한 합성곱신경망 손동작 인식)

  • Song, Keun San;Lee, Hyun Ju;Tae, Ki Sik
    • Journal of Biomedical Engineering Research
    • /
    • v.42 no.4
    • /
    • pp.186-192
    • /
    • 2021
  • In this paper, we evaluated the prediction accuracy of an Euler-angle spectrogram classification method using a convolutional neural network (CNN) for hand gesture recognition in an augmented reality (AR) cognitive rehabilitation system based on the Leap Motion Controller (LMC). Hand gesture recognition using a conventional support vector machine (SVM) shows 91.3% accuracy over multiple motions. In this paper, five hand gestures ("Promise", "Bunny", "Close", "Victory", and "Thumb") were selected and each measured 100 times to test the utility of the spectral classification technique. In validation, all five hand gestures were correctly predicted 100% of the time, indicating higher recognition accuracy than the conventional SVM methods. CNN-based hand motion recognition is therefore expected to be more useful for LMC-based AR cognitive rehabilitation training systems than SVM-based sign language recognition.
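
The pipeline described above (Euler-angle time series → spectrogram image → CNN classifier) can be sketched for its feature-extraction step. The window and hop sizes and the synthetic roll-angle trace below are illustrative assumptions, not values from the paper:

```python
import numpy as np

def euler_angle_spectrogram(angle_series, win=32, hop=16):
    """Short-time FFT magnitudes of one Euler-angle channel.

    Returns an (n_frames, win // 2 + 1) array; stacking the roll, pitch,
    and yaw channels gives an image-like input for the CNN classifier.
    """
    frames = []
    for start in range(0, len(angle_series) - win + 1, hop):
        seg = angle_series[start:start + win] * np.hanning(win)
        frames.append(np.abs(np.fft.rfft(seg)))
    return np.array(frames)

# Synthetic stand-in for an LMC roll-angle trace during a gesture.
t = np.linspace(0.0, 1.0, 128)
roll = np.sin(2 * np.pi * 5 * t)
spec = euler_angle_spectrogram(roll)
print(spec.shape)  # (7, 17)
```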

Analysis of Face Direction and Hand Gestures for Recognition of Human Motion (인간의 행동 인식을 위한 얼굴 방향과 손 동작 해석)

  • Kim, Seong-Eun;Jo, Gang-Hyeon;Jeon, Hui-Seong;Choe, Won-Ho;Park, Gyeong-Seop
    • Journal of Institute of Control, Robotics and Systems
    • /
    • v.7 no.4
    • /
    • pp.309-318
    • /
    • 2001
  • In this paper, we describe methods for analyzing human gestures. A human interface (HI) system for analyzing gestures extracts the head and hand regions from image sequences of an operator's continuous behavior captured with CCD cameras. Since gestures are performed with the operator's head and hands, we extract the head and hand regions to analyze gestures and calculate geometrical information of the extracted skin regions. Head motion is analyzed by obtaining the face direction. We assume the head is an ellipsoid in 3D coordinates and locate facial features such as the eyes, nose, and mouth on its surface. Once the center of the feature points is known, the angle of that center within the ellipsoid gives the direction of the face. The hand region obtained from preprocessing may include the arm as well as the hand. To extract only the hand region, we find the wrist line that divides the hand and arm regions. After separating the hand region at the wrist line, we model it as an ellipse for the analysis of hand data; the finger part is represented as a long, narrow shape. We extract hand information such as size, position, and shape.
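
The ellipse model for the segmented hand region can be sketched with second-order image moments, a standard way to recover a region's centroid and orientation. The function and the diagonal test mask below are illustrative, not the authors' implementation:

```python
import numpy as np

def ellipse_from_mask(mask):
    """Fit an ellipse to a binary region via second-order image moments.

    Returns (centroid_y, centroid_x, orientation_rad) of the best-fit
    ellipse axes for the region's pixels.
    """
    ys, xs = np.nonzero(mask)
    cy, cx = ys.mean(), xs.mean()
    mu20 = ((xs - cx) ** 2).mean()          # variance along x
    mu02 = ((ys - cy) ** 2).mean()          # variance along y
    mu11 = ((xs - cx) * (ys - cy)).mean()   # covariance
    theta = 0.5 * np.arctan2(2 * mu11, mu20 - mu02)
    return cy, cx, theta

# A 45-degree diagonal strip as a stand-in for a segmented hand region.
mask = np.eye(20, dtype=bool)
cy, cx, theta = ellipse_from_mask(mask)
print(int(round(np.degrees(theta))))  # 45
```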


A Comparison of Head-Hand Coordination Patterns during Squash Forehand Strokes in Expert and Less-Skilled Squash Players

  • Roh, Miyoung
    • Korean Journal of Applied Biomechanics
    • /
    • v.28 no.2
    • /
    • pp.109-117
    • /
    • 2018
  • Objective: To compare head and hand movement patterns during squash forehand motions between expert and less-skilled squash players. Method: Four experts and four less-skilled squash players participated in this study. They performed squash forehand swings, and a VICON motion analysis system was used to obtain displacement and velocity data of the head and right hand during the movement. Mann-Whitney U-tests were performed to compare head and hand range of motion and peak velocity, and cross-correlation was used to analyze the head-hand coordination pattern between groups in three movement directions. Results: In terms of head and hand kinematic data, experts had a greater head range of motion during down swings than less-skilled players. Experts appeared to reach peak hand velocity at impact through a temporal sequence in which peak head velocity preceded peak hand velocity. In terms of head-hand coordination patterns, both groups showed high positive correlations in the medial-lateral direction, indicating a dominant allocentric coordination pattern. However, experts had uncoupled coordination patterns in the vertical direction, whereas less-skilled players showed high positive correlations. These results indicate that the head-hand movement pattern is likely an important factor in the squash forehand movement. Conclusion: Analysis of head and hand movement patterns could be a key variable in squash training to reach expert-level performance.

Development of a General Purpose Motion Controller Using a Field Programmable Gate Array (FPGA를 이용한 범용 모션 컨트롤러의 개발)

  • Kim, Sung-Soo;Jung, Seul
    • Journal of Institute of Control, Robotics and Systems
    • /
    • v.10 no.1
    • /
    • pp.73-80
    • /
    • 2004
  • We have developed a general purpose motion controller using an FPGA (Field Programmable Gate Array). Multiple PID controllers and a GUI are implemented as a system-on-chip for multi-axis motion control. Compared with the commercial motion controller LM629, it has multiple independent PID controllers, giving advantages such as space efficiency, lower cost, and lower power consumption. To test the performance of the proposed controller, the motion of a robot hand is controlled. The robot hand has three fingers with two joints each, and the finger movements show that tracking was very effective. A further experiment, balancing an inverted pendulum on a cart, demonstrates the generality of the proposed FPGA PID controller: it maintained the balance of the pendulum well.
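
The per-axis control law replicated on the FPGA is a standard discrete PID loop. A software sketch follows; the gains, time step, and first-order plant are illustrative assumptions, not the paper's hardware parameters:

```python
class PID:
    """Discrete PID loop of the kind replicated per axis on the FPGA."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def step(self, setpoint, measured):
        err = setpoint - measured
        self.integral += err * self.dt           # accumulate I term
        deriv = (err - self.prev_err) / self.dt  # finite-difference D term
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

# Drive a first-order plant x' = u toward a setpoint of 1.0.
pid, x = PID(kp=2.0, ki=0.5, kd=0.1, dt=0.01), 0.0
for _ in range(2000):
    x += pid.step(1.0, x) * 0.01
print(abs(x - 1.0) < 0.05)  # True: settled near the setpoint
```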

Robot User Control System using Hand Gesture Recognizer (수신호 인식기를 이용한 로봇 사용자 제어 시스템)

  • Shon, Su-Won;Beh, Joung-Hoon;Yang, Cheol-Jong;Wang, Han;Ko, Han-Seok
    • Journal of Institute of Control, Robotics and Systems
    • /
    • v.17 no.4
    • /
    • pp.368-374
    • /
    • 2011
  • This paper proposes a robot control human interface using a hidden Markov model (HMM) based hand signal recognizer. The command-receiving humanoid robot sends webcam images to a client computer, which extracts the commanding human's hand motion descriptors. Upon feature acquisition, the hand signal recognizer carries out the recognition procedure, and the result is sent back to the robot for responsive actions. System performance is evaluated by measuring recognition of a '48 hand signal set' created randomly from a fundamental hand motion set. For isolated motion recognition, the '48 hand signal set' shows a 97.07% recognition rate while the 'baseline hand signal set' shows 92.4%, validating that the proposed hand signal recognizer is highly discriminative. For connected motions of the '48 hand signal set', it shows a 97.37% recognition rate. The experiments demonstrate that the proposed system is promising for real-world human-robot interface applications.
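
HMM-based gesture classification of the kind described above scores an observation sequence against each gesture's model and picks the best. A minimal Viterbi-scoring sketch follows; the two toy 2-state models and the symbol sequence are made up, not the paper's 48-signal models:

```python
import numpy as np

def viterbi_score(log_A, log_B_obs, log_pi):
    """Log-probability of the best state path for one observation sequence.

    log_A: (S, S) transition log-probs; log_B_obs: (T, S) per-frame emission
    log-probs; log_pi: (S,) initial log-probs.
    """
    delta = log_pi + log_B_obs[0]
    for t in range(1, len(log_B_obs)):
        delta = log_B_obs[t] + np.max(delta[:, None] + log_A, axis=0)
    return delta.max()

# Two toy 2-state left-to-right models over a 2-symbol feature alphabet.
A = np.log(np.array([[0.7, 0.3], [0.0, 1.0]]) + 1e-12)
pi = np.log(np.array([1.0, 0.0]) + 1e-12)
B_up = np.array([[0.9, 0.1], [0.1, 0.9]])  # state 0 emits '0', state 1 emits '1'
B_down = B_up[::-1]                         # mirrored emissions
obs = [0, 0, 1, 1]                          # sequence that looks like "up"
scores = {name: viterbi_score(A, np.log(B[:, obs].T), pi)
          for name, B in (("up", B_up), ("down", B_down))}
print(max(scores, key=scores.get))  # up
```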

Presentation control of the computer using the motion identification rules (모션 식별 룰을 이용한 컴퓨터의 프레젠테이션 제어)

  • Lee, Sang-yong;Lee, Kyu-won
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference
    • /
    • 2015.05a
    • /
    • pp.586-589
    • /
    • 2015
  • A computer presentation control system using hand-motion identification rules is proposed. To identify the presenter's hand motions, the face region is first extracted using a Haar classifier. The motion status (pattern) and position of the hands are discriminated using the centers of gravity of the user's face and hands after segmenting the hand area in the YCbCr color model. The hand motion is matched against the motion identification rules, and the corresponding presentation control command is executed. The proposed system uses the motion identification rules without additional equipment, is able to control the presentation, and does not depend on the complexity of the background. The proposed algorithm showed stable control operation in presentation experiments at indoor illumination levels of 15, 20, and 30 lx.
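
The skin segmentation and center-of-gravity steps above can be sketched in a few lines. The YCbCr conversion uses the standard BT.601 coefficients, and the threshold ranges are common skin-color defaults, not the paper's calibrated values:

```python
import numpy as np

def skin_mask_ycbcr(img_rgb, cb=(77, 127), cr=(133, 173)):
    """Threshold an RGB image in YCbCr space (BT.601 coefficients)."""
    r, g, b = img_rgb[..., 0], img_rgb[..., 1], img_rgb[..., 2]
    Cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    Cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return (cb[0] <= Cb) & (Cb <= cb[1]) & (cr[0] <= Cr) & (Cr <= cr[1])

def center_of_gravity(mask):
    """Centroid (row, col) of the True pixels in a binary mask."""
    ys, xs = np.nonzero(mask)
    return ys.mean(), xs.mean()

# A flat skin-toned patch on a black background (synthetic test image).
img = np.zeros((40, 40, 3))
img[10:20, 5:15] = (200, 130, 100)  # roughly skin-colored RGB
mask = skin_mask_ycbcr(img)
cy, cx = center_of_gravity(mask)
print(cy, cx)  # 14.5 9.5
```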


Fuzzy rule-based Hand Motion Estimation for A 6 Dimensional Spatial Tracker

  • Lee, Sang-Hoon;Kim, Hyun-Seok;Suh, Il-Hong;Park, Myung-Kwan
    • Proceedings of the Institute of Control, Robotics and Systems Conference
    • /
    • 2004.08a
    • /
    • pp.82-86
    • /
    • 2004
  • A fuzzy rule-based hand-motion estimation algorithm is proposed for a six-dimensional spatial tracker that employs low-cost accelerometers and gyros. Specifically, the beginning and end of hand motions must be accurately detected to initiate and terminate the integration process that yields the position and pose of the hand from the accelerometer and gyro signals, since errors due to noise and/or hand-shaking motions accumulate through the integration. Fuzzy rules for yes/no hand-motion detection are proposed based on the accelerometer signals and the sum of the derivatives of the accelerometer and gyro signals. Several experimental results are shown to validate the proposed algorithm.
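
A fuzzy start/stop decision of the kind described can be sketched with simple membership functions over the accelerometer magnitude and its derivative. The breakpoints and the OR-combination below are illustrative assumptions, not the paper's tuned rules:

```python
import numpy as np

def ramp(x, a, b):
    """Ramp membership: 0 below a, 1 above b, linear in between."""
    return float(np.clip((x - a) / (b - a), 0.0, 1.0))

def moving_degree(acc_mag, deriv_mag):
    """Fuzzy degree in [0, 1] that the hand is moving, from the
    accelerometer magnitude and the summed derivative magnitude of the
    accelerometer/gyro signals. Breakpoints (0.05, 0.5) are illustrative.
    """
    # Rule: the hand is moving if the acceleration is large OR its
    # derivative is large (fuzzy OR = max of memberships).
    return max(ramp(acc_mag, 0.05, 0.5), ramp(deriv_mag, 0.05, 0.5))

print(moving_degree(0.6, 0.01) > 0.5)   # True: clear motion, start integrating
print(moving_degree(0.01, 0.01) > 0.5)  # False: at rest, suppress drift
```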


Development of a hand motion analysis system using a 3-D Glove (3-D Glove를 이용한 손동작의 분석 시스템 개발)

  • 윤명환;권오채;한수미;박재희;이경태
    • Proceedings of the ESK Conference
    • /
    • 1997.10a
    • /
    • pp.393-397
    • /
    • 1997
  • This study proposes a method that uses an angle-measuring glove (3-D Glove) from VR environments for the analysis of hand motion and manual tasks. The developed analysis system computes total muscle moment and total muscle excursion values for a hand motion or manual task from the angle data measured by an 18-sensor CyberGlove™ system, and computes digit and joint moment values along the X, Y, and Z directions. The system consists of: (1) a hand motion measurement system based on the CyberGlove™ system and the digital data processing of the analysis system; (2) a Kinematic Hand Model that can represent hand motion in 3D space from the CyberGlove™ data; and (3) a 3-D Hand Biomechanical Model, based on the Hand Model and the CyberGlove™ system, for the mechanical analysis of hand motion in 3D. The system is applicable to diverse fields such as telerobotics, medicine, and virtual reality, and is expected to aid the design of products, manual control devices, and computer I/O devices involving manual tasks.
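
One of the quantities above, total excursion, can be sketched directly from the glove's joint-angle stream; the summed-absolute-change proxy and the two-sensor toy recording below are illustrative, not the system's biomechanical model:

```python
import numpy as np

def total_excursion(angles_deg):
    """Sum of absolute joint-angle changes per sensor over a recording,
    a simple proxy for the glove system's total-excursion measure.

    angles_deg: (T, n_sensors) array (18 columns for the CyberGlove).
    """
    return np.abs(np.diff(angles_deg, axis=0)).sum(axis=0)

# Two hypothetical sensors: one flexes 0 -> 90 -> 0 degrees, one is still.
angles = np.zeros((5, 2))
angles[:, 0] = [0, 45, 90, 45, 0]
exc = total_excursion(angles)
print(exc[0], exc[1])  # 180.0 0.0
```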


Advanced Representation Method of Hand Motion by Cheremes Analysis in KSL (수화소 분석을 통한 손동작 움직임 표현방법)

  • Lee, Boo-Hyung;Song, Pil-Jae
    • Journal of Korea Multimedia Society
    • /
    • v.9 no.8
    • /
    • pp.1067-1075
    • /
    • 2006
  • This paper proposes an advanced method of representing hand motion through chereme analysis in Korean Sign Language (KSL). The proposed method applies to the hand motions used in KSL to represent rich and unified hand movement. Words and sentences in KSL are formed by combining elements called cheremes, such as hand movement orientation, finger shape, and hand position. In this paper, the cheremes composing KSL are divided into five elements: hand movement orientation (HMO), finger shape (FS), hand orientation (HO), hand position (HP), and number of hands used (HN). Each chereme is expressed by several characteristics. For example, the hand movement orientation denotes the directions in which the hand moves while signing and is expressed by 17 orientation components. The finger shape denotes the various shapes the fingers can take and is represented by 17 components. The hand orientation is expressed by two characteristics, according to whether the palm or the back of the hand is used. The hand position denotes the specific body region where the hand(s) is placed while signing and is divided into eight regions. Finally, the number of hands indicates whether one hand or both hands are used and is expressed by two characteristics. The proposed method has been tested with KSL words and sentences, and the results show that they can be expressed completely by the proposed representation.
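
The five-element decomposition above maps naturally onto a small record type; the field vocabularies and the example sign below are illustrative stand-ins, not entries from the paper's actual inventories:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Chereme:
    """The five elements of the paper's KSL representation."""
    hmo: str  # hand movement orientation (17 components in the paper)
    fs: str   # finger shape (17 components)
    ho: str   # hand orientation: 'palm' or 'back'
    hp: str   # hand position (one of 8 body regions)
    hn: int   # number of hands used: 1 or 2

# A hypothetical sign composed from the five elements.
sign = Chereme(hmo="upward", fs="flat", ho="palm", hp="chest", hn=2)
print(sign.hn)  # 2
```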


Door opening control using the multi-fingered robotic hand for the indoor service robot PSR

  • Rhee, Chang-Ju;Shim, Young-Bo;Chung, Woo-Jin;Kim, Mun-Sang;Park, Jong-Hyun
    • Proceedings of the Institute of Control, Robotics and Systems Conference
    • /
    • 2003.10a
    • /
    • pp.1093-1098
    • /
    • 2003
  • In this paper, a practical methodology of hand-manipulator motion coordination for an indoor service robot is introduced. The paper describes the procedure of opening a door, performed by the service robot, as a notable example of motion coordination. It presents a well-structured framework for hand-manipulator motion coordination that includes intelligent sensor data interpretation, object shape estimation, optimal grasping, on-line motion planning, and behavior-based task execution. The proposed approach focuses on integrating these functions in harmony so that the robot can complete its operation within the limits of the available resources. As a practical example of implementation, successful experimental results are provided for opening a door whose geometric parameters are unknown beforehand.
