• Title/Abstract/Keyword: gesture control


Hybrid HMM for Transitional Gesture Classification in Thai Sign Language Translation

  • Jaruwanawat, Arunee;Chotikakamthorn, Nopporn;Werapan, Worawit
    • Institute of Control, Robotics and Systems (ICROS) Conference Proceedings / ICCAS 2004 / pp.1106-1110 / 2004
  • A human sign language generally comprises both static and dynamic gestures. Each gesture is represented by a hand shape, its position, and, for a dynamic gesture, hand movement. One of the problems in automated sign language translation is segmenting a hand movement that is part of a transition from one gesture to another. This transitional gesture conveys no meaning, but serves as a connecting period between two consecutive gestures. Based on the observation that many dynamic gestures in the Thai sign language dictionary are quasi-periodic, a method was developed to differentiate between a (meaningful) dynamic gesture and a transitional movement. However, some meaningful dynamic gestures are non-periodic, and these cannot be distinguished from a transitional movement using signal quasi-periodicity alone. This paper proposes a hybrid method that combines the periodicity-based gesture segmentation method with an HMM-based gesture classifier. The HMM classifier detects dynamic signs of a non-periodic nature; combined with the periodicity-based segmentation method, this hybrid scheme can identify segments of a transitional movement. In addition, exploiting the quasi-periodic nature of many dynamic sign gestures significantly reduces the dimensionality of the HMM part of the proposed method, saving computation compared with a standard HMM-based method. The proposed method's recognition performance is reported through experiments with real measurements.

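The periodicity test at the heart of the segmentation method above can be sketched with a plain autocorrelation check: a movement signal is judged quasi-periodic (a meaningful repeating gesture) if its normalized autocorrelation has a clear secondary peak, while a one-way transitional movement has none. The signals and the 0.5 threshold below are illustrative assumptions, not values from the paper.

```python
import math

def autocorr(signal, lag):
    """Normalized autocorrelation of a zero-mean copy of `signal` at `lag`."""
    n = len(signal)
    mean = sum(signal) / n
    centered = [s - mean for s in signal]
    energy = sum(c * c for c in centered)
    if energy == 0:
        return 0.0
    return sum(centered[i] * centered[i + lag] for i in range(n - lag)) / energy

def is_quasi_periodic(signal, threshold=0.5):
    """True if the autocorrelation has a local peak above `threshold` at a
    non-zero lag, i.e. the movement repeats rather than drifting one way."""
    ac = [autocorr(signal, lag) for lag in range(1, len(signal) // 2)]
    return any(
        ac[i - 1] < ac[i] > ac[i + 1] and ac[i] >= threshold
        for i in range(1, len(ac) - 1)
    )

# A repeating (gesture-like) trajectory vs. a one-way transitional movement.
periodic = [math.sin(2 * math.pi * t / 10) for t in range(60)]
transition = [t / 59 for t in range(60)]
```

Requiring a *local* peak, rather than just a large autocorrelation value, is what rejects the monotone transitional ramp, whose autocorrelation decays without ever rising again.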

다변량 퍼지 의사결정트리와 사용자 적응을 이용한 손동작 인식 (Hand Gesture Recognition using Multivariate Fuzzy Decision Tree and User Adaptation)

  • 전문진;도준형;이상완;박광현;변증남
    • The Journal of Korea Robotics Society / Vol. 3 No. 2 / pp.81-90 / 2008
  • With the increasing demand for services for the disabled and the elderly, assistive technologies have developed rapidly. Natural human signals such as voice and gesture have been applied to systems that assist the disabled and the elderly. As an example of this kind of human-robot interface, the Soft Remote Control System has been developed by HWRS-ERC at KAIST [1]. This system is a vision-based hand gesture recognition system for controlling home appliances such as televisions, lamps, and curtains. One of its most important technologies is the hand gesture recognition algorithm. The problems that most frequently lower the recognition rate of hand gestures are inter-person variation and intra-person variation. Intra-person variation can be handled by introducing fuzzy concepts. In this paper, we propose a multivariate fuzzy decision tree (MFDT) learning and classification algorithm for hand motion recognition. To recognize the hand gestures of a new user, the most suitable recognition model among several well-trained models is selected using a model selection algorithm and incrementally adapted to the user's hand gestures. To show the general performance of MFDT as a classifier, we report classification rates on benchmark data from the UCI repository. For hand gesture recognition performance, we tested hand gesture data collected from 10 people over 15 days. The experimental results show that the classification and user adaptation performance of the proposed algorithm is better than that of a general fuzzy decision tree.

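The "multivariate" part of an MFDT can be illustrated with a toy single-node tree: instead of a crisp split on one feature, a node tests a soft membership of a weighted combination of features, and a sample descends both branches to a degree. The feature meanings, weights, and class names below are invented for illustration only.

```python
def node_membership(x, w, b):
    """Fuzzy 'high' membership of the multivariate projection w.x + b,
    clipped to [0, 1]: a soft, multivariate split rather than a crisp one."""
    s = sum(wi * xi for wi, xi in zip(w, x)) + b
    return min(1.0, max(0.0, 0.5 + s))

def classify(x):
    """Classify with a one-node fuzzy tree: the sample reaches both leaves,
    weighted by its membership in each branch."""
    mu = node_membership(x, w=(1.0, 1.0), b=-1.0)
    scores = {"open_hand": mu, "fist": 1.0 - mu}
    return max(scores, key=scores.get)
```

For example, a sample with two large (normalized) features such as `(0.9, 0.9)` gets high membership in the "high" branch and is labeled `open_hand`, while `(0.1, 0.1)` is labeled `fist`.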

원격 카메라 로봇 제어를 위한 동적 제스처 인식 (Dynamic Gesture Recognition for the Remote Camera Robot Control)

  • 이주원;이병로
    • Journal of the Korea Institute of Information and Communication Engineering / Vol. 8 No. 7 / pp.1480-1487 / 2004
  • This paper proposes a new gesture recognition method for remote camera robot control. Segmentation of dynamic gestures is a preprocessing step in gesture recognition; existing methods for it require a large amount of color information about the recognition target, and at the recognition stage demand many feature vectors per gesture. To address these drawbacks, this paper proposes a new Max-Min search method for segmenting dynamic gestures, a mean-space mapping method and a center-of-gravity method for extracting gesture features, and a multilayer perceptron neural network structure for recognition. In experiments, the recognition rate of the proposed method exceeded 90%, showing that it can serve as a human-computer interface (HCI) device for remote robot control.

Hand Mouse System Using a Pre-defined Gesture for the Elimination of a TV Remote Controller

  • Kim, Kyung-Won;Bae, Dae-Hee;Yi, Joonhwan;Oh, Seong-Jun
    • IEIE Transactions on Smart Processing and Computing / Vol. 1 No. 2 / pp.88-94 / 2012
  • Many hand gesture recognition systems that use advanced computer vision techniques to eliminate the need for a TV remote controller have been proposed. Nevertheless, some issues remain, such as high computational complexity and insufficient information on the target object and background. Moreover, none of the proposed techniques consider how to enter the control mode of the system, which means they may still need a TV remote controller to enter it. This paper proposes a hand mouse system that uses a pre-defined gesture with high background adaptability. By doing so, the remote controller otherwise needed to enter the control mode of the IPTV system can be eliminated.

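The paper's key point, entering control mode without a remote, amounts to a small state machine: the system ignores all hand motion until it sees the pre-defined entry gesture, then interprets subsequent gestures as mouse commands. The state and gesture names below are hypothetical, not the paper's.

```python
IDLE, CONTROL = "IDLE", "CONTROL"

def step(state, gesture):
    """Advance the hand-mouse state machine by one recognized gesture,
    returning (new_state, command_to_emit_or_None)."""
    if state == IDLE:
        # Only the pre-defined entry gesture leaves IDLE; everything else,
        # including ordinary hand motion in front of the TV, is ignored.
        return (CONTROL, None) if gesture == "entry_pose" else (IDLE, None)
    if gesture == "exit_pose":
        return IDLE, None
    return CONTROL, gesture  # forward e.g. "move" or "click" as mouse commands
```

A casual wave in IDLE produces no command, while the same wave after `entry_pose` is forwarded, which is exactly the behavior that makes the physical remote unnecessary.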

Implementation of a Gesture Recognition Signage Platform for Factory Work Environments

  • Rho, Jungkyu
    • International Journal of Internet, Broadcasting and Communication / Vol. 12 No. 3 / pp.171-176 / 2020
  • This paper presents an implementation of a gesture recognition platform that can be used in factory workplaces. The platform consists of signage displays that show workers' job orders and a control center used to manage work orders for factory workers. Workers do not need to carry work order documents and can browse their assigned work orders on the signage at their workplaces. The contents of the signage are controlled by the worker's hand and arm gestures: gestures are extracted from body movement tracked by a 3D depth camera and converted to commands that control the displayed content of the signage. Using the control center, the factory manager can assign tasks to each worker, upload work order documents to the system, and see each worker's progress. The implementation has been applied experimentally to a machining factory workplace. This platform not only provides convenience for factory workers at their workplaces and improves the security of technical documents, but can also be used to build smart factories.

Gesture based Natural User Interface for e-Training

  • Lim, C.J.;Lee, Nam-Hee;Jeong, Yun-Guen;Heo, Seung-Il
    • Journal of the Ergonomics Society of Korea / Vol. 31 No. 4 / pp.577-583 / 2012
  • Objective: This paper describes the process and results of developing a gesture recognition-based natural user interface (NUI) for a vehicle maintenance e-Training system. Background: E-Training refers to education and training that builds and improves the capabilities needed to perform tasks by using information and communication technology (simulation, 3D virtual reality, and augmented reality), devices (PC, tablet, smartphone, and HMD), and environments (wired/wireless internet and cloud computing). Method: Palm movement from a depth camera is used as a pointing device, and finger movement, extracted using the OpenCV library, serves as the selection protocol. Results: The proposed NUI allows trainees to control objects, such as cars and engines, on a large screen through gesture recognition. It also includes a learning environment for understanding the procedure for assembling or disassembling certain parts. Conclusion: Future work will address gesture recognition for multiple trainees at once. Application: The results of this interface can be applied not only to e-Training systems, but also to other systems such as digital signage, tangible games, and 3D content control.

자연스런 손동작을 이용한 모바일 로봇의 동작제어 (Motion Control of a Mobile Robot Using Natural Hand Gesture)

  • 김아람;이상용
    • Journal of Korean Institute of Intelligent Systems / Vol. 24 No. 1 / pp.64-70 / 2014
  • Robots that share everyday living spaces with humans require natural means of communication. Rather than conventional, simple robot-control schemes, control methods that work like interaction between people are therefore needed. Previous studies have focused on recognizing human actions themselves, which makes natural communication difficult. This paper proposes a method for controlling a mobile robot through natural hand gestures using a hidden Markov model (HMM) and fuzzy inference. Color and depth data are acquired with a Kinect sensor, the user's hand is located, and hand gestures are recognized using an HMM and Mamdani fuzzy inference. The recognized result is sent to the robot to move it in the desired direction.
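The Mamdani-style inference step mentioned above can be sketched for one input: the horizontal hand offset (normalized to [-1, 1]) is fuzzified into left/center/right sets, each rule fires to a degree, and a defuzzified steering command results. The set shapes and rule table are illustrative assumptions, and the centroid step is simplified to a weighted average of singleton consequents rather than a full clipped-area centroid.

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def steer(offset):
    """Defuzzified turn command in [-1, 1] for a hand offset in [-1, 1]."""
    # Rule strengths: IF hand is left THEN turn left, and so on. The Mamdani
    # min over antecedents is trivial here because each rule has just one.
    strength = {
        -1.0: tri(offset, -2.0, -1.0, 0.0),  # consequent center: turn left
         0.0: tri(offset, -1.0,  0.0, 1.0),  # consequent center: go straight
         1.0: tri(offset,  0.0,  1.0, 2.0),  # consequent center: turn right
    }
    num = sum(center * w for center, w in strength.items())
    den = sum(strength.values())
    return num / den if den else 0.0
```

A hand halfway to the right (`offset = 0.5`) fires the "straight" and "right" rules equally, yielding a half-strength right turn; this graded output is what makes the control feel natural compared with a crisp left/right switch.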

기계 장치와의 상호작용을 위한 실시간 저비용 손동작 제어 시스템 (A Real Time Low-Cost Hand Gesture Control System for Interaction with Mechanical Device)

  • 황태훈;김진헌
    • Journal of the Institute of Korean Electrical and Electronics Engineers (IKEEE) / Vol. 23 No. 4 / pp.1423-1429 / 2019
  • Recently, human-machine interfaces (HMI), systems that support efficient interaction, have gained popularity. This paper proposes a new real-time, low-cost hand gesture control system as one method of in-vehicle interaction. Detecting the hand region with an RGB camera requires heavy computation, so a time-of-flight (TOF) camera is used instead to acquire depth information and reduce computation time. In addition, Fourier descriptors are used to shrink the learning model: because a Fourier descriptor uses only a small number of points from the whole image, the model can be made compact. To evaluate the proposed technique, its speed was compared on a desktop and a Raspberry Pi 2; the results show little performance difference between the small embedded board and the desktop. A recognition rate of 95.16% was achieved in the gesture recognition experiment.
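The Fourier-descriptor idea that keeps the model small can be sketched directly: a hand contour is treated as a sequence of complex numbers, and only a few low-order DFT coefficients, normalized for position and scale, describe the shape. The square contour and the choice of four descriptors below are illustrative, not from the paper.

```python
import cmath

def fourier_descriptors(contour, k=4):
    """First k translation- and scale-invariant descriptors of a closed
    contour given as a list of (x, y) points."""
    z = [complex(x, y) for x, y in contour]
    n = len(z)
    coeffs = [
        sum(z[t] * cmath.exp(-2j * cmath.pi * u * t / n) for t in range(n)) / n
        for u in range(k + 1)
    ]
    # Dropping the DC term (u = 0) removes translation; dividing by |c1|
    # removes scale, so only shape information remains.
    scale = abs(coeffs[1]) or 1.0
    return [abs(c) / scale for c in coeffs[1 : k + 1]]

# The descriptors of a square contour are unchanged when the square is
# shifted and uniformly scaled.
square = [(0, 0), (1, 0), (2, 0), (2, 1), (2, 2), (1, 2), (0, 2), (0, 1)]
moved = [(2 * x + 5, 2 * y + 3) for x, y in square]
```

This compactness is the point made in the abstract: a handful of normalized coefficients can stand in for the whole contour, which shrinks the learning model enough to run on a board like the Raspberry Pi 2.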

강인한 손가락 끝 추출과 확장된 CAMSHIFT 알고리즘을 이용한 자연스러운 Human-Robot Interaction을 위한 손동작 인식 (A Robust Fingertip Extraction and Extended CAMSHIFT based Hand Gesture Recognition for Natural Human-like Human-Robot Interaction)

  • 이래경;안수용;오세영
    • Journal of Institute of Control, Robotics and Systems / Vol. 18 No. 4 / pp.328-336 / 2012
  • In this paper, we propose robust fingertip extraction and extended Continuously Adaptive Mean Shift (CAMSHIFT) based hand gesture recognition for natural, human-like HRI (human-robot interaction). First, for efficient and rapid hand detection, hand candidate regions are segmented by combining a robust YCbCr skin color model with Haar-like feature-based AdaBoost. Using the extracted hand candidate regions, we estimate the palm region and fingertip position from distance-transform-based voting and the geometric features of hands. From the hand orientation and palm center position, we find the optimal fingertip position and its orientation. Then, using extended CAMSHIFT, we reliably track the 2D hand gesture trajectory with the extracted fingertip. Finally, we apply conditional density propagation (CONDENSATION) to recognize pre-defined temporal motion trajectories. Experimental results show that the proposed algorithm not only rapidly extracts the hand region, with an accurately extracted fingertip and its angle, but also robustly tracks the hand under different illumination, size, and rotation conditions. Using these results, we successfully recognize multiple hand gestures.
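The palm-center step above, locating the palm from a distance transform, can be sketched on a tiny binary hand mask: each hand pixel's distance to the nearest background pixel is computed (here with a BFS-based distance transform), and the pixel with the largest distance is taken as the palm center, since fingers are thin and score low. The toy mask is a simplification of the paper's voting scheme.

```python
from collections import deque

def palm_center(mask):
    """(row, col) of the hand pixel farthest from the background."""
    rows, cols = len(mask), len(mask[0])
    dist = [[None] * cols for _ in range(rows)]
    queue = deque()
    # Seed the BFS with every background pixel and the image border.
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] == 0 or r in (0, rows - 1) or c in (0, cols - 1):
                dist[r][c] = 0
                queue.append((r, c))
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and dist[nr][nc] is None:
                dist[nr][nc] = dist[r][c] + 1
                queue.append((nr, nc))
    return max(
        ((r, c) for r in range(rows) for c in range(cols)),
        key=lambda rc: dist[rc[0]][rc[1]],
    )

# A toy hand mask: a 5x5 palm block with a thin finger extending right.
mask = [
    [0, 0, 0, 0, 0, 0, 0, 0, 0],
    [0, 1, 1, 1, 1, 1, 0, 0, 0],
    [0, 1, 1, 1, 1, 1, 0, 0, 0],
    [0, 1, 1, 1, 1, 1, 1, 1, 0],
    [0, 1, 1, 1, 1, 1, 0, 0, 0],
    [0, 1, 1, 1, 1, 1, 0, 0, 0],
    [0, 0, 0, 0, 0, 0, 0, 0, 0],
]
```

The finger pixels sit at distance 1 from the background while the palm interior reaches distance 3, so the maximum lands in the middle of the palm block; fingertip candidates would then be sought at the far, low-distance extremities relative to this center.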

피아노 연주 로봇의 개발 (Development of Piano Playing Robot)

  • 박광현;정성훈;;;변증남
    • The Korean Institute of Electrical Engineers (KIEE) Conference Proceedings / KIEE 2007 Symposium, Information and Control Section / pp.334-336 / 2007
  • This paper presents a beat gesture recognition method to synchronize the tempo of a piano-playing robot with the tempo desired by the user. To detect an unstructured beat gesture expressed by any part of the body, we apply an optical flow method and obtain the trajectories of the center of gravity and the normalized central moments of the moving objects in the images. The period of a beat gesture is estimated from the results of the fast Fourier transform. In addition, we apply a motion control method by which robotic fingers are trained to follow a set of trajectories. Since the ability to track the trajectories influences the sound the piano generates, we adopt an iterative learning control method to reduce the tracking error.

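The beat-period estimation described above can be sketched with a plain DFT: the centroid trajectory of the moving body part is transformed, and the dominant non-DC frequency bin gives the beat period that the robot's tempo would then follow. The synthetic trajectory and the 30 fps sampling rate are assumptions for illustration.

```python
import cmath
import math

def beat_period(trajectory, fps):
    """Beat period in seconds from the dominant DFT bin of the trajectory."""
    n = len(trajectory)
    mean = sum(trajectory) / n
    centered = [y - mean for y in trajectory]
    spectrum = [
        abs(sum(centered[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                for t in range(n)))
        for k in range(1, n // 2)  # skip DC; keep positive frequencies only
    ]
    k_peak = spectrum.index(max(spectrum)) + 1  # +1 restores the skipped DC bin
    return n / (k_peak * fps)

# A hand bobbing twice per second, sampled for 2 seconds at 30 fps.
trajectory = [math.sin(2 * math.pi * 2.0 * t / 30) for t in range(60)]
```

For this 2 Hz bobbing motion the dominant bin corresponds to a 0.5 s beat, i.e. 120 beats per minute, which is the tempo the piano-playing fingers would be synchronized to.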