• Title/Summary/Keyword: Hand Gesture Recognition (손동작인식)

Search Results: 180

Virtual Reality-Based Exercise Games for Finger Rehabilitation Following Chronic Stroke (만성 뇌졸중 환자의 손가락 재활을 위한 가상현실 기반의 운동 게임)

  • Park, Hee-Woo;Kim, Young;Seo, Jung-Yeon;Lee, Hwa-Min
    • Proceedings of the Korea Information Processing Society Conference / 2017.04a / pp.1100-1102 / 2017
  • Finger exercise is known to have the greatest influence on the brain, and because the hand combines motor and sensory functions, it is essential for daily activities such as eating and dressing. This study aims to develop a game for the finger rehabilitation of chronic stroke patients by linking 'Real Sense', which recognizes the patient's hand gestures, with the game engine 'Unity3D'. The proposed game is a serious game that achieves the specific goal of finger rehabilitation by adding physical activity, and its difficulty is set by dividing the given tasks into stages. To increase patient engagement, we designed it as a familiar game format rather than a rigid screen, helping patients pursue rehabilitation voluntarily and without boredom; by using their fingers in a balanced way while playing, patients can also improve brain activity. Whereas conventional rehabilitation requires patients to visit a hospital and purchase expensive rehabilitation equipment, this study uses the comparatively inexpensive and lightweight 'Real Sense' so that rehabilitation can be performed without constraints of time and place.

A Robust Fingertip Extraction and Extended CAMSHIFT based Hand Gesture Recognition for Natural Human-like Human-Robot Interaction (강인한 손가락 끝 추출과 확장된 CAMSHIFT 알고리즘을 이용한 자연스러운 Human-Robot Interaction을 위한 손동작 인식)

  • Lee, Lae-Kyoung;An, Su-Yong;Oh, Se-Young
    • Journal of Institute of Control, Robotics and Systems / v.18 no.4 / pp.328-336 / 2012
  • In this paper, we propose robust fingertip extraction and extended Continuously Adaptive Mean Shift (CAMSHIFT) based hand gesture recognition for natural, human-like HRI (Human-Robot Interaction). First, for efficient and rapid hand detection, hand candidate regions are segmented by combining a robust $YC_bC_r$ skin color model with Haar-like feature based AdaBoost. Using the extracted hand candidate regions, we estimate the palm region and fingertip position from distance-transform-based voting and the geometrical features of hands. From the hand orientation and palm center position, we find the optimal fingertip position and its orientation. Then, using extended CAMSHIFT, we reliably track the 2D hand gesture trajectory with the extracted fingertip. Finally, we apply conditional density propagation (CONDENSATION) to recognize predefined temporal motion trajectories. Experimental results show that the proposed algorithm not only rapidly extracts the hand region with an accurately extracted fingertip and its angle, but also robustly tracks the hand under different illumination, size, and rotation conditions. Using these results, we successfully recognize multiple hand gestures.
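The tracking core that CAMSHIFT extends is the mean-shift window update: repeatedly move a search window to the centroid of a probability (back-projection) map until it stops moving. A minimal pure-Python sketch of that update, on a synthetic weight map (the function name and toy data are illustrative assumptions, not the paper's implementation):

```python
# Minimal mean-shift window update on a 2D back-projection map.
# weights: 2D list of per-pixel probabilities; window: (x, y, w, h).
def mean_shift(weights, window, max_iter=20):
    x, y, w, h = window
    rows, cols = len(weights), len(weights[0])
    for _ in range(max_iter):
        m00 = m10 = m01 = 0.0
        for j in range(y, y + h):          # zeroth and first moments
            for i in range(x, x + w):      # over the current window
                p = weights[j][i]
                m00 += p
                m10 += p * i
                m01 += p * j
        if m00 == 0:                       # no mass inside the window
            break
        # re-center the window on the weighted centroid, clamped to the map
        nx = max(0, min(cols - w, int(round(m10 / m00 - w / 2))))
        ny = max(0, min(rows - h, int(round(m01 / m00 - h / 2))))
        if (nx, ny) == (x, y):             # converged
            break
        x, y = nx, ny
    return (x, y, w, h)
```

CAMSHIFT additionally adapts the window size and orientation each iteration; the fixed-size update above is only the inner loop.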

Garden Alive : An interaction-enabled intelligent garden (Garden Alive : 상호작용 가능한 지능적인 가상 화단)

  • Ha, Tae-Jin;Woo, Woon-Tack
    • Proceedings of the Korean Information Science Society Conference / 2005.11b / pp.559-561 / 2005
  • This paper proposes a system (Garden Alive) that lets users experience an intelligent virtual flower garden through a tangible interface. The proposed system consists of a real-world flower bed equipped with a camera and illuminance and humidity sensors, a tangible interface used for interaction, an artificial intelligence module with an evolution module and an emotion module, and a Virtual Garden that shows the growth and responses of the virtual plants. The tangible interface recognizes the user's hand gestures with the camera, checks the amount of light with the illuminance sensor, and measures the amount of water with the humidity sensor. Based on this information, the AI module determines the direction of the plants' evolution and the changes in their emotional state. The Virtual Garden is built on an L-system so that the virtual plants grow in forms similar to real plants. In the proposed Garden Alive, each plant in the garden has its own genes, showing plant diversity, and by evaluating fitness with respect to environmental factors such as light and moisture, the plants can be seen evolving over generations. Finally, rather than plants that merely react to stimuli, we implemented intelligent plants that respond appropriately through emotional changes arising from interaction with the user. The proposed system can therefore be applied as content for entertainment and education, or as content that provides psychological comfort through individualized responses to each user.
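The plant-growth model mentioned above is an L-system: a string-rewriting grammar applied repeatedly to an axiom. A minimal rewriter, using the classic algae grammar as a stand-in (the axiom and rule are the textbook toy example, not the paper's actual plant grammar):

```python
# Minimal L-system string rewriter: apply the production rules to every
# symbol of the current string, `steps` times. Symbols without a rule
# are copied unchanged.
def lsystem(axiom, rules, steps):
    s = axiom
    for _ in range(steps):
        s = "".join(rules.get(c, c) for c in s)
    return s

# Lindenmayer's algae system: A -> AB, B -> A
grown = lsystem("A", {"A": "AB", "B": "A"}, 3)
```

A rendering layer would then interpret symbols such as `F`, `+`, `-`, `[`, `]` as turtle-graphics drawing commands to produce the plant shapes.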


Hand Gesture Recognition with Convolution Neural Networks for Augmented Reality Cognitive Rehabilitation System Based on Leap Motion Controller (립모션 센서 기반 증강현실 인지재활 훈련시스템을 위한 합성곱신경망 손동작 인식)

  • Song, Keun San;Lee, Hyun Ju;Tae, Ki Sik
    • Journal of Biomedical Engineering Research / v.42 no.4 / pp.186-192 / 2021
  • In this paper, we evaluate the prediction accuracy of an Euler-angle spectrogram classification method using a convolutional neural network (CNN) for hand gesture recognition in an augmented reality (AR) cognitive rehabilitation system based on the Leap Motion Controller (LMC). A conventional support vector machine (SVM) approach achieves 91.3% accuracy over multiple motions. Here, five hand gestures ("Promise", "Bunny", "Close", "Victory", and "Thumb") were selected and each measured 100 times to test the utility of the spectral classification technique. In validation, all five hand gestures were predicted correctly 100% of the time, indicating higher recognition accuracy than the conventional SVM method. These results suggest that CNN-based hand motion recognition is more useful for LMC-based AR cognitive rehabilitation training systems than SVM-based sign language recognition.
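The CNN described above slides learned filters over the Euler-angle spectrogram; the elementary operation is a convolution along the time axis. A dependency-free sketch of that operation on a toy angle sequence (the difference kernel and the data are illustrative assumptions, not the paper's trained filters):

```python
# Valid-mode 1D convolution (cross-correlation, as CNN layers compute it).
def conv1d(signal, kernel):
    k = len(kernel)
    return [sum(signal[i + j] * kernel[j] for j in range(k))
            for i in range(len(signal) - k + 1)]

# A difference kernel responds where the joint angle changes abruptly,
# e.g. at the instant a finger flexes from 0 to 90 degrees.
angles = [0.0, 0.0, 0.0, 90.0, 90.0, 90.0]
response = conv1d(angles, [-1.0, 1.0])
```

A real CNN stacks many such filters (learned, not hand-chosen) in 2D over the time-frequency plane, followed by nonlinearities and a classification head.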

Color Area Correction Algorithm for Tracking Curved Fingertip (구부러진 손가락 끝점 추적을 위한 컬러 영역 보정 알고리즘)

  • Kang, Sung-Kwan;Chung, Kyung-Yong;Rim, Kee-Wook;Lee, Jung-Hyun
    • The Journal of the Korea Contents Association / v.11 no.10 / pp.11-18 / 2011
  • Much research in image processing has addressed fingertip tracking. The most common approach first extracts color information, then applies a blob coloring algorithm to the skin contour and takes the highest point of the contour as the fingertip. When the finger is bent, however, this point is not the actual fingertip, so the method detects the wrong location. This paper proposes a color area correction algorithm for tracking the tip of a bent finger, which resolves the mislocalization that occurs in the bent-finger condition. The proposed method also anticipates problems arising from user tendencies and corrects them in advance, improving efficiency. Finally, an empirical application is presented to verify the adequacy and validity of the proposed method. Accordingly, the satisfaction and quality of image recognition services can be improved.
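The failure mode described above is easy to reproduce: on a bent finger, the topmost contour point lands on the knuckle, not the tip. A toy comparison of the conventional rule against a simple alternative (the point farthest from the contour centroid); the contour coordinates and both helper functions are illustrative assumptions, not the paper's correction algorithm:

```python
# Two fingertip candidates from a hand/finger contour, given as (x, y)
# points in image coordinates (y grows downward).

def topmost(contour):
    # conventional rule: smallest y = highest pixel in the image
    return min(contour, key=lambda p: p[1])

def farthest_from_centroid(contour):
    # alternative: the contour point farthest from the centroid,
    # which still lands on the tip when the finger curls sideways
    cx = sum(x for x, _ in contour) / len(contour)
    cy = sum(y for _, y in contour) / len(contour)
    return max(contour, key=lambda p: (p[0] - cx) ** 2 + (p[1] - cy) ** 2)

# Toy outline of a finger bent to the right: the true tip is at (10, 8),
# but the topmost point (6, 4) is on the knuckle.
contour = [(5, 5), (6, 4), (7, 4), (8, 5), (9, 6), (10, 8)]
```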

Vision and Depth Information based Real-time Hand Interface Method Using Finger Joint Estimation (손가락 마디 추정을 이용한 비전 및 깊이 정보 기반 손 인터페이스 방법)

  • Park, Kiseo;Lee, Daeho;Park, Youngtae
    • Journal of Digital Convergence / v.11 no.7 / pp.157-163 / 2013
  • In this paper, we propose a real-time hand gesture interface based on visual and depth information that uses finger joint estimation. The left and right hand areas are segmented after mapping the visual image onto the depth image, followed by labeling and boundary noise removal. The centroid and rotation angle of each hand area are then calculated. Afterwards, a circle is expanded outward from the centroid of the hand, and the joint points and end points of the fingers are detected from the midpoints of the segments where the circle crosses the hand boundary; the hand model is then recognized. Experimental results show that our method distinguishes fingertips and recognizes various hand gestures quickly and accurately. In experiments on various hand poses with hidden fingers using both hands, accuracy exceeded 90% and performance exceeded 25 fps. The proposed method can be used as a contactless input interface in HCI control, education, and game applications.
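The circle-expansion step above can be sketched directly: sample a circle around the hand centroid and count the runs of hand pixels it crosses; at a suitable radius, each run corresponds to a finger (or the wrist). The sampling density, the toy plus-shaped mask, and the run-counting helper are illustrative assumptions, not the paper's implementation:

```python
import math

# Count the number of contiguous hand-pixel runs that a circle of radius r
# around (cx, cy) crosses in a binary mask (mask[y][x] == 1 means hand).
def crossings(mask, cx, cy, r, samples=360):
    inside = []
    for k in range(samples):
        a = 2 * math.pi * k / samples
        x = int(round(cx + r * math.cos(a)))
        y = int(round(cy + r * math.sin(a)))
        on = (0 <= y < len(mask) and 0 <= x < len(mask[0])
              and mask[y][x] == 1)
        inside.append(on)
    # a run starts at each off->on transition around the circular ring
    return sum(1 for k in range(samples) if inside[k] and not inside[k - 1])
```

Growing `r` and recording where the run count changes is what lets the method place joint points at successive radii.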

Development of Interactive Signage using Floating Hologram (플로팅 홀로그램을 이용한 인터랙티브 사이니지 개발)

  • Kim, Dong-Jing;Jeong, Dong Hyo;Kim, Tae-Yong
    • Journal of the Institute of Convergence Signal Processing / v.19 no.4 / pp.180-185 / 2018
  • We have developed an interactive signage system based on a floating hologram by combining hologram and ICT technologies, aimed at strengthening the competitiveness of small businesses with excellent products and services. The developed system can be used for low-cost publicity and marketing by small business owners, introducing menus with 3D hologram images and providing various contents that respond to the user's hand movements. The system detects ten finger movements at a rate of 290 frames per second within a range of 60 cm and a field of 150 degrees. Through physical motion experiments with the Leap Motion device, we also confirmed that the virtual touch function operates normally by dividing the user's recognized motion into a hover zone and a touch zone.
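The hover/touch split described above amounts to thresholding the fingertip's distance from a virtual touch plane. A minimal sketch, where the plane position and hover depth are assumed values rather than the paper's calibration:

```python
# Classify a fingertip by its signed distance (in mm) in front of the
# virtual touch plane: at or behind the plane = touch, within a shallow
# band in front of it = hover, otherwise no interaction.
def zone(z_mm, touch_plane=0.0, hover_depth=40.0):
    if z_mm <= touch_plane:
        return "touch"
    if z_mm <= touch_plane + hover_depth:
        return "hover"
    return "none"
```

In practice the hover zone is also used for visual feedback (e.g. highlighting the content the finger is approaching) before a touch is committed.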

A Remote Control of 6 d.o.f. Robot Arm Based on 2D Vision Sensor (2D 영상센서 기반 6축 로봇 팔 원격제어)

  • Hyun, Woong-Keun
    • The Journal of the Korea institute of electronic communication sciences / v.17 no.5 / pp.933-940 / 2022
  • In this paper, an algorithm was developed to recognize the 3D position of a hand through a 2D image sensor, and a system was implemented to remotely control a 6 d.o.f. robot arm using it. The system consists of a camera that acquires the hand position in 2D and a computer that controls the robot arm according to the recognized hand position. The image sensor recognizes the specific color of a glove worn on the operator's hand and outputs the recognized region as a rectangle enclosing the glove's color area, together with its position. From the position and size of the detected rectangle, we derive the velocity vector of the end effector and control the robot arm. Through several experiments with the developed 6-axis robot, it was confirmed that remote control of the 6 d.o.f. robot arm was performed successfully.
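One plausible reading of the rectangle-to-velocity mapping above: the image-plane offset of the rectangle center drives x/y motion, and the rectangle's area (a rough proxy for the hand's distance from the camera) drives z. The frame size, gains, and dead zones below are assumed values for illustration, not the paper's calibration:

```python
# Map a detected glove rectangle (x, y, w, h) in image coordinates to an
# end-effector velocity command (vx, vy, vz).
def velocity_from_rect(rect, frame_w=640, frame_h=480, ref_area=10000):
    def axis(err, dead, gain):
        # dead zone suppresses jitter around the neutral pose
        return 0.0 if abs(err) < dead else gain * err

    x, y, w, h = rect
    cx, cy = x + w / 2, y + h / 2
    vx = axis(cx - frame_w / 2, dead=10, gain=0.01)    # left/right offset
    vy = axis(cy - frame_h / 2, dead=10, gain=0.01)    # up/down offset
    vz = axis(w * h - ref_area, dead=1000, gain=0.0001)  # bigger box = closer hand
    return (vx, vy, vz)
```

Velocity (rather than position) commands make the control robust to brief detection dropouts: a missed frame simply sends zero velocity.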

Visual Touchless User Interface for Window Manipulation (윈도우 제어를 위한 시각적 비접촉 사용자 인터페이스)

  • Kim, Jin-Woo;Jung, Kyung-Boo;Jeong, Seung-Do;Choi, Byung-Uk
    • Journal of KIISE:Software and Applications / v.36 no.6 / pp.471-478 / 2009
  • Recently, research on user interfaces has advanced remarkably due to the explosive growth of 3-dimensional contents and applications and the widening range of computer users. This paper proposes a novel method to manipulate windows efficiently using only intuitive hand motion. Previous methods have drawbacks such as the burden of expensive devices, the high complexity of gesture recognition, and the need for additional information such as markers. To overcome these defects, we propose a novel visual touchless interface. First, we detect the hand region using the hue channel in HSV color space. The distance transform is applied to find the centroid of the hand, and the curvature of the hand contour is used to determine the fingertip positions. Finally, using the hand motion information, we recognize the hand gesture as one of seven predefined motions, which serves as a command to control the window. Because the method adopts a stereo camera, the user can manipulate windows with a sense of depth in the real environment. Intuitive manipulation is also available because the proposed method supports visual touch of the virtual object the user wants to manipulate using only simple hand motions. Finally, the efficiency of the proposed method is verified via an application based on the proposed interface.
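The distance-transform step above (also used by several papers in this list) picks the palm center as the hand pixel farthest from the background. A dependency-free sketch using a 4-connected BFS distance transform; the toy mask and the BFS formulation are illustrative choices, not the paper's implementation:

```python
from collections import deque

# Return the (x, y) of the hand pixel with maximal distance to the
# background in a binary mask (mask[y][x] == 1 means hand).
def palm_center(mask):
    h, w = len(mask), len(mask[0])
    dist = [[None] * w for _ in range(h)]
    q = deque()
    for y in range(h):                      # background pixels seed the BFS
        for x in range(w):
            if mask[y][x] == 0:
                dist[y][x] = 0
                q.append((x, y))
    while q:                                # multi-source BFS: 4-connected
        x, y = q.popleft()
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= nx < w and 0 <= ny < h and dist[ny][nx] is None:
                dist[ny][nx] = dist[y][x] + 1
                q.append((nx, ny))
    return max(((x, y) for y in range(h) for x in range(w)),
               key=lambda p: dist[p[1]][p[0]])
```

Production code would use an exact Euclidean distance transform (e.g. OpenCV's), but the maximum-distance principle is the same.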

The Development of a Real-Time Hand Gestures Recognition System Using Infrared Images (적외선 영상을 이용한 실시간 손동작 인식 장치 개발)

  • Ji, Seong Cheol;Kang, Sun Woo;Kim, Joon Seek;Joo, Hyonam
    • Journal of Institute of Control, Robotics and Systems / v.21 no.12 / pp.1100-1108 / 2015
  • A camera-based real-time hand posture and gesture recognition system is proposed for controlling various devices inside automobiles. It uses an imaging system composed of a camera with a proper filter and an infrared lighting device to acquire images of hand-motion sequences. Several pre-processing steps are applied, followed by background normalization, before segmenting the hand from the background. The hand posture is determined by first separating the fingers from the main body of the hand and then finding the relative positions of the fingers with respect to the center of the hand. The beginning and end of the hand motion are detected from the sequence of acquired images using predefined motion rules, which triggers gesture recognition. A set of carefully designed features is extracted from the raw sequence and fed into a decision-tree-like rule to determine the hand gesture. Many experiments were performed to verify the system. In this paper, we report performance results on 550 hand-motion image sequences collected from five different individuals, covering the variation among many users of the system in a real-time environment. Of these, 539 sequences are correctly recognized, a recognition rate of 98%.
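A decision-tree-like rule of the kind described above is just a cascade of threshold tests on the extracted features. The feature names, thresholds, and gesture labels below are illustrative assumptions, not the paper's actual rules:

```python
# Classify a gesture from a finger count and the hand's net displacement
# (dx, dy, in pixels) over the motion sequence, via nested threshold tests.
def classify(n_fingers, dx, dy, thresh=30):
    if n_fingers >= 4:
        return "open-palm"                 # static posture overrides motion
    if abs(dx) > abs(dy):                  # dominant axis: horizontal
        if dx > thresh:
            return "swipe-right"
        if dx < -thresh:
            return "swipe-left"
    else:                                  # dominant axis: vertical
        if dy > thresh:
            return "swipe-down"
        if dy < -thresh:
            return "swipe-up"
    return "hold"                          # motion too small to classify
```

Such hand-built trees are fast and transparent, which suits an in-vehicle real-time setting; a learned tree would replace the thresholds with values fit to the 550 collected sequences.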