• Title/Abstract/Keyword: MOTION RECOGNITION

Search results: 773

A Robust Approach for Human Activity Recognition Using 3-D Body Joint Motion Features with Deep Belief Network

  • Uddin, Md. Zia; Kim, Jaehyoun
    • KSII Transactions on Internet and Information Systems (TIIS) / Vol. 11, No. 2 / pp. 1118-1133 / 2017
  • Computer vision-based human activity recognition (HAR) has attracted considerable attention owing to its applications in fields such as smart-home healthcare for elderly people. A video-based activity recognition system generally aims to react to people's behavior so that it can proactively assist them with their tasks. This work proposes a novel approach to depth-video-based human activity recognition using joint-based motion features of depth body shapes and a Deep Belief Network (DBN). From the depth video, the body parts involved in each activity are first segmented by means of a trained random forest. Motion features representing the magnitude and direction of each joint's displacement in the next frame are then extracted. Finally, the features are used to train a DBN that is later used for recognition. The proposed HAR approach outperformed conventional approaches on private and public datasets, indicating its suitability for practical applications in smartly controlled environments.
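
A minimal sketch (not the authors' code) of the joint-based motion features described above, assuming the 3-D joint positions have already been estimated from the depth frames (e.g., by the trained random forest); the random-forest segmentation and DBN training are omitted.

```python
# Sketch, not the authors' code: magnitude/direction motion features per joint
# between two consecutive depth frames, assuming joint positions are given.
import numpy as np

def joint_motion_features(joints_t, joints_t1):
    """joints_t, joints_t1: (J, 3) arrays of 3-D joint positions in consecutive
    frames. Returns (J, 4): displacement magnitude plus a unit direction vector,
    i.e. the kind of per-joint feature a DBN could be trained on."""
    disp = joints_t1 - joints_t                                # per-joint displacement
    mag = np.linalg.norm(disp, axis=1, keepdims=True)
    direction = np.divide(disp, mag, out=np.zeros_like(disp), where=mag > 0)
    return np.hstack([mag, direction])

# Two synthetic frames with 15 joints each.
rng = np.random.default_rng(0)
f0 = rng.normal(size=(15, 3))
f1 = f0 + rng.normal(scale=0.05, size=(15, 3))
print(joint_motion_features(f0, f1).shape)                     # (15, 4)
```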

Selection of features and hidden Markov model parameters for English word recognition from Leap Motion air-writing trajectories

  • Deval Verma; Himanshu Agarwal; Amrish Kumar Aggarwal
    • ETRI Journal / Vol. 46, No. 2 / pp. 250-262 / 2024
  • Air-writing recognition is relevant in areas such as natural human-computer interaction, augmented reality, and virtual reality. A trajectory is the most natural way to represent air writing. We analyze the recognition accuracy of words written in air considering five features, namely, writing direction, curvature, trajectory, orthocenter, and ellipsoid, as well as different parameters of a hidden Markov model classifier. Experiments were performed on two representative datasets, whose sample trajectories were collected using a Leap Motion Controller from a fingertip performing air writing. Dataset D1 contains 840 English words from 21 classes, and dataset D2 contains 1600 English words from 40 classes. A genetic algorithm was combined with a hidden Markov model classifier to obtain the best subset of features. The combination {trajectory, orthocenter, writing direction, curvature} provided the best feature set, achieving recognition accuracies on datasets D1 and D2 of 98.81% and 83.58%, respectively.
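
A small illustrative sketch of the feature-selection idea above: a genetic algorithm searching over binary feature-subset masks. The fitness function below is a placeholder; in the paper it would be the HMM word-recognition accuracy obtained with that subset, so the feature names are the only part taken from the abstract.

```python
# Sketch (assumptions, not the paper's code): a minimal genetic algorithm over
# binary feature-subset masks; the fitness is a placeholder standing in for
# HMM word-recognition accuracy evaluated on the selected features.
import numpy as np

FEATURES = ["writing direction", "curvature", "trajectory", "orthocenter", "ellipsoid"]
rng = np.random.default_rng(42)

def fitness(mask):
    # Placeholder: reward subsets resembling the paper's best combination.
    # In the real pipeline this would train/evaluate per-word HMMs on the subset.
    target = np.array([1, 1, 1, 1, 0])
    return -np.sum(mask != target) - 0.01 * mask.sum()

def ga_select(pop_size=20, generations=30, p_mut=0.1):
    pop = rng.integers(0, 2, size=(pop_size, len(FEATURES)))
    for _ in range(generations):
        scores = np.array([fitness(ind) for ind in pop])
        parents = pop[np.argsort(scores)[-pop_size // 2:]]      # truncation selection
        cut = rng.integers(1, len(FEATURES), size=pop_size)     # one-point crossover
        children = np.array([np.concatenate([parents[rng.integers(len(parents))][:c],
                                             parents[rng.integers(len(parents))][c:]])
                             for c in cut])
        flip = rng.random(children.shape) < p_mut               # bit-flip mutation
        pop = np.where(flip, 1 - children, children)
    best = pop[np.argmax([fitness(ind) for ind in pop])]
    return [f for f, keep in zip(FEATURES, best) if keep]

print(ga_select())   # expected to converge near {trajectory, orthocenter, direction, curvature}
```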

Traffic Sign Recognition and Tracking Using RANSAC-Based Motion Estimation for Autonomous Vehicles

  • 김성욱;이준웅
    • Journal of Institute of Control, Robotics and Systems / Vol. 22, No. 2 / pp. 110-116 / 2016
  • Autonomous vehicles must obey traffic laws in order to drive on actual roads. Traffic signs erected at the roadside convey road traffic information and regulations, so traffic sign recognition is necessary for autonomous vehicles. In this paper, color characteristics are first used to detect traffic sign candidates. Subsequently, HOG (Histogram of Oriented Gradients) features are extracted from the detected candidates and the traffic sign is recognized with an SVM (Support Vector Machine). However, owing to varying conditions such as changes in weather and lighting, it is difficult to recognize traffic signs robustly using the SVM alone. To solve this problem, we propose a tracking algorithm with RANSAC-based motion estimation. Using two-point motion estimation, inlier feature points within the traffic sign are selected, and the optimal motion is then computed from the inliers through bundle adjustment. This approach greatly enhances traffic sign recognition performance.
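
A brief sketch of the recognition stage (HOG features plus an SVM), using scikit-image and scikit-learn rather than the authors' implementation; the color-based candidate detection and the RANSAC/bundle-adjustment tracking are not shown, and the data here are synthetic stand-ins.

```python
# Sketch under stated assumptions (not the authors' code): HOG features from
# candidate sign patches classified with an SVM, as in the recognition stage
# described above. Candidate detection and tracking are omitted.
import numpy as np
from skimage.feature import hog
from sklearn.svm import SVC

def hog_feature(patch_gray_64x64):
    return hog(patch_gray_64x64, orientations=9,
               pixels_per_cell=(8, 8), cells_per_block=(2, 2))

# Synthetic stand-in data: in practice these would be detected sign candidates.
rng = np.random.default_rng(0)
patches = rng.random((40, 64, 64))
labels = rng.integers(0, 3, size=40)          # e.g. 3 hypothetical sign classes

X = np.array([hog_feature(p) for p in patches])
clf = SVC(kernel="rbf").fit(X, labels)
print(clf.predict(X[:5]))
```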

3-D Hand Motion Recognition Using a Data Glove

  • 김지환;박진우;;김태성
    • Proceedings of the HCI Society of Korea 2009 Conference / pp. 324-329 / 2009
  • Hand Motion Recognition (HMR), a core technology of proactive computing, is being actively studied in the field of Human-Computer Interaction (HCI). In this study, we fabricated a data glove equipped with 3-axis acceleration sensors, implemented a 3-D hand model, and developed a hand motion recognition technique based on them. The data glove serves as an input device for virtual reality; in this paper it was implemented so that the signals acquired from the 3-axis acceleration sensors are transmitted to a PC via wireless communication. For hand modeling, a 3-D hand model based on kinematic-chain theory was built using ellipsoids, and a rule-based algorithm was applied to the acceleration data obtained from the data glove to recognize simple hand motions (scissors, rock, paper) through the implemented 3-D hand model.
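
A minimal sketch of the rule-based idea in the abstract, with assumed axis conventions and thresholds (the paper's actual rules and hand model are not reproduced here): classify rock/paper/scissors from per-finger 3-axis accelerometer readings.

```python
# Minimal sketch, not the paper's rule set: rock/paper/scissors from per-finger
# 3-axis accelerometer readings on a data glove. Axis convention and thresholds
# are assumptions for illustration.
import numpy as np

def finger_extended(acc_xyz, z_threshold=0.5):
    """A finger-mounted accelerometer at rest mainly senses gravity; assume the
    z axis points along the finger, so a large z component means 'extended'."""
    return acc_xyz[2] / (np.linalg.norm(acc_xyz) + 1e-9) > z_threshold

def classify_rps(glove_acc):
    """glove_acc: (5, 3) array, one accelerometer per finger."""
    extended = sum(finger_extended(a) for a in glove_acc)
    if extended <= 1:
        return "rock"
    if extended >= 4:
        return "paper"
    return "scissors"

print(classify_rps(np.array([[0, 0, 1]] * 2 + [[1, 0, 0]] * 3, dtype=float)))  # scissors
```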

A Study on an Air Interface System (AIS) Using an Infrared (IR) Camera

  • 김효성;정현기;김병규
    • The KIPS Transactions: Part B / Vol. 18B, No. 3 / pp. 109-116 / 2011
  • In this paper, we implement an air interface, a next-generation interface that allows a computer to be operated with hand motions alone, without any mechanical control device. To build the air interface system, the principle of total internal reflection of infrared light is first exploited, and the hand region is then segmented from the acquired infrared images. The hand region segmented in every frame is used as the input to a hand-motion recognition module for event handling, and control is finally performed by recognizing hand motions mapped to individual control events. This study introduces the image processing and recognition techniques implemented for hand region detection and tracking and for hand motion recognition; the developed air interface system is expected to be highly useful for street advertising, presentations, kiosks, and similar applications.
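
A rough sketch of the per-frame pipeline described above, with assumed threshold values and a hypothetical event mapping (not the authors' code): segment the bright hand region in the IR image, track its centroid, and map its motion to a control event.

```python
# Sketch of the described pipeline with assumed thresholds (not the authors'
# code): segment the bright hand blob in an IR frame, track its centroid, and
# map the motion between frames to a hypothetical control event.
import numpy as np
import cv2

def hand_centroid(ir_frame_gray, thresh=200):
    _, binary = cv2.threshold(ir_frame_gray, thresh, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    hand = max(contours, key=cv2.contourArea)          # assume the largest blob is the hand
    m = cv2.moments(hand)
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])

def to_event(prev, curr, min_move=40):
    """Map centroid displacement between frames to a hypothetical control event."""
    if prev is None or curr is None:
        return None
    dx = curr[0] - prev[0]
    if dx > min_move:
        return "SWIPE_RIGHT"
    if dx < -min_move:
        return "SWIPE_LEFT"
    return None

frame = np.zeros((240, 320), np.uint8)
cv2.circle(frame, (160, 120), 30, 255, -1)             # synthetic bright hand blob
print(hand_centroid(frame))
```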

Development of Motion Mechanisms for Health-Care Riding Robots

  • 김진수;임미섭;임준홍
    • Proceedings of the KIEE 39th Summer Conference, 2008 / pp. 1735-1736 / 2008
  • In this research, a riding robot system named "RideBot" is developed for health care and entertainment. The riding robot can follow the rider's intention and simulate the motion of a horse. Its mechanisms support attitude detection, motion sensing, recognition, a common interface, and motion generation, and the robot can react to the user's health condition, bio-signals, and intention. One objective of this research is for the riding robot to capture the user's motion and produce spontaneous movements. In this paper, we develop a saddle mechanism that can generate 3-degree-of-freedom riding motion based on the rider's intention, as well as rein and spur mechanisms for estimating the rider's intention and a bio-signal monitoring system for the health-care function. To evaluate the performance of the riding robot system, we tested several riding motions, including slow and normal step motions and left and right turns.
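
Purely illustrative sketch (the paper describes hardware mechanisms rather than code): generating a 3-degree-of-freedom saddle trajectory for a chosen gait, scaled by a rider-intention input such as the rein/spur signals; the gait frequencies and amplitudes below are assumptions.

```python
# Illustrative sketch only, not from the paper: a 3-DOF saddle trajectory
# (roll, pitch, heave) for a chosen gait, scaled by a rider-intention input.
# Gait frequencies and amplitudes are assumed values.
import numpy as np

GAITS = {"slow_step": 1.0, "normal_step": 1.8}          # assumed stride frequencies [Hz]

def saddle_motion(t, gait="normal_step", intention=1.0, turn=0.0):
    """t: time array [s]; intention in [0, 1] scales amplitude (rein/spur input);
    turn in [-1, 1] biases roll for left/right turns."""
    f = GAITS[gait]
    heave = 0.02 * intention * np.sin(2 * np.pi * f * t)               # vertical bounce [m]
    pitch = 0.05 * intention * np.sin(2 * np.pi * f * t + np.pi / 2)   # fore-aft rock [rad]
    roll = 0.03 * intention * np.sin(np.pi * f * t) + 0.05 * turn      # side sway + turn bias
    return np.stack([roll, pitch, heave], axis=1)

t = np.linspace(0, 2, 200)
print(saddle_motion(t, gait="slow_step", intention=0.5, turn=-0.3).shape)   # (200, 3)
```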

Motion-Understanding Cell Phones for Intelligent User Interaction and Entertainment

  • 조성정;최은석;방원철;양징;조준기;기은광;손준일;김동윤;김상룡
    • Proceedings of the HCI Society of Korea 2006 Conference, Part 1 / pp. 684-691 / 2006
  • As many functionalities such as cameras and MP3 players converge into mobile phones, more intuitive and interesting interaction methods become essential. In this paper, we present applications and their enabling technologies for gesture-interactive cell phones. They employ a gesture recognition algorithm and a real-time shake detection algorithm to support motion-based user interfaces and entertainment applications, respectively. The gesture recognition algorithm classifies a user's movement into one of the predefined gestures by modeling basic components of the acceleration signals and their relationships; recognition performance is further enhanced by discriminating frequently confused classes with support vector machines. The shake detection algorithm detects in real time the exact moment the phone is shaken significantly by utilizing the variance and mean of the acceleration signals. The gesture interaction algorithms show performance reliable enough for commercialization: with 100 novice users, the average recognition rate was 96.9% on 11 gestures (digits 1-9, O, X), and users' movements were detected in real time. We have applied these motion-understanding technologies to Samsung cell phones in the Korean, American, Chinese, and European markets since May 2005.
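
A minimal sketch of the shake-detection idea stated in the abstract, thresholding the variance and mean of the acceleration magnitude over a sliding window; the window length and thresholds are assumed values, not the commercial implementation.

```python
# Minimal sketch, assuming a 3-axis accelerometer stream (not the product code):
# detect shake moments by thresholding the variance of the acceleration
# magnitude, combined with its mean deviating from gravity, over a short window.
import numpy as np

def shake_moments(acc, window=20, var_threshold=4.0):
    """acc: (N, 3) acceleration samples [m/s^2]. Returns sample indices where a
    shake is detected. Window length and thresholds are assumed values."""
    mag = np.linalg.norm(acc, axis=1)
    hits = []
    for i in range(window, len(mag)):
        w = mag[i - window:i]
        if w.var() > var_threshold and abs(w.mean() - 9.8) > 1.0:
            hits.append(i)
    return hits

rng = np.random.default_rng(1)
still = np.tile([0.0, 0.0, 9.8], (100, 1)) + rng.normal(scale=0.05, size=(100, 3))
shake = np.tile([0.0, 0.0, 9.8], (40, 1)) + rng.normal(scale=6.0, size=(40, 3))
print(shake_moments(np.vstack([still, shake]))[:3])    # detected indices (shake starts at sample 100)
```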

Generation of Adaptive Motion Using Quasi-simultaneous Recognition of Plural Targets

  • Mizushima, T.; Minami, M.; Mae, Y.; Sakamoto, Y.; Song, W.
    • Proceedings of ICCAS 2005 (ICROS) / pp. 882-887 / 2005
  • The paper describes quasi-simultaneous recognition of plural targets and motion control of a robot based on that recognition. The method searches for targets with a model-based matching method using a hybrid GA, and the robot's motion is generated based on the targets' positions in the image. The method is applied to a soccer robot, where the targets in the experiment are a ball, a goal, and an opponent. The experimental results show the robustness and reliability of the proposed method.
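
An illustrative toy version of model-based matching driven by a genetic algorithm (the paper's hybrid GA and robot control are not reproduced): the GA searches image coordinates that maximize a fitness measuring how well a simple circular "ball" model fits the image at that position.

```python
# Sketch (assumptions, not the authors' hybrid GA): model-based matching where a
# GA searches image coordinates maximizing the agreement between a circular
# "ball" model and the image at that position.
import numpy as np

rng = np.random.default_rng(3)
img = np.zeros((120, 160))
yy, xx = np.mgrid[0:120, 0:160]
img[(yy - 70) ** 2 + (xx - 110) ** 2 < 10 ** 2] = 1.0    # synthetic ball at x=110, y=70

def fitness(pos, radius=10):
    x, y = int(pos[0]) % 160, int(pos[1]) % 120
    mask = (yy - y) ** 2 + (xx - x) ** 2 < radius ** 2
    return img[mask].mean()                               # model/image agreement

pop = rng.uniform([0, 0], [160, 120], size=(60, 2))
for _ in range(40):
    scores = np.array([fitness(p) for p in pop])
    parents = pop[np.argsort(scores)[-15:]]                               # elitist selection
    mutants = parents[rng.integers(15, size=30)] + rng.normal(scale=4, size=(30, 2))
    fresh = rng.uniform([0, 0], [160, 120], size=(15, 2))                 # random immigrants
    pop = np.vstack([parents, mutants, fresh])
best = pop[np.argmax([fitness(p) for p in pop])]
print(best.round())                                       # typically converges near [110, 70]
```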

Design of OpenCV based Finger Recognition System using binary processing and histogram graph

  • Baek, Yeong-Tae; Lee, Se-Hoon; Kim, Ji-Seong
    • Journal of the Korea Society of Computer and Information / Vol. 21, No. 2 / pp. 17-23 / 2016
  • NUI (Natural User Interface) is a motion-based interface that lets a user control a device with body movements, without HID devices such as a mouse and keyboard. In this paper, we use a Pi Camera and sensors connected to a small embedded board, the Raspberry Pi, together with OpenCV algorithms optimized for image recognition and computer vision, to implement an NUI device that is more human-friendly and intuitive than traditional HID equipment. Motion is detected through frame-comparison operations, and we propose a more advanced motion-sensing and recognition system built around the sensors connected to the Raspberry Pi.
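
A sketch of the idea named in the title, with assumed details (not the authors' code): binarize the hand image, build a column histogram of foreground pixels, and count fingers as separated peaks in that histogram.

```python
# Sketch with assumed details (not the authors' code): binary processing plus a
# column histogram of foreground pixels, with fingers counted as separated peaks.
import numpy as np
import cv2

def count_fingers(gray, thresh=128, min_height=15):
    _, binary = cv2.threshold(gray, thresh, 255, cv2.THRESH_BINARY)
    hist = (binary > 0).sum(axis=0)                     # foreground pixels per column
    above = hist > min_height                           # columns tall enough to be a finger
    # Count runs of consecutive "tall" columns as individual fingers.
    return int(np.count_nonzero(np.diff(above.astype(int)) == 1) + above[0])

# Synthetic test image: three vertical bars standing in for raised fingers.
img = np.zeros((100, 120), np.uint8)
for x in (20, 50, 80):
    img[20:90, x:x + 10] = 255
print(count_fingers(img))                               # 3
```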

Study on User Interface for a Capacitive-Sensor Based Smart Device

  • Jung, Sun-IL; Kim, Young-Chul
    • Smart Media Journal / Vol. 8, No. 3 / pp. 47-52 / 2019
  • In this paper, we designed HW/SW interfaces for processing the signals of capacitive sensors such as the Electric Potential Sensor (EPS), which detect disturbances of the surrounding electric field as feature signals for motion recognition systems, and we implemented a smart light control system with those interfaces. In the system, the on/off switch and brightness adjustment are controlled by hand gestures using the designed and fabricated interface circuits. PWM (Pulse Width Modulation) signals from the controller, fed through a driver IC, drive the LED and control its brightness and on/off operation. Using the hand-gesture signals obtained through the EPS sensors and the interface HW/SW, we can not only build a gesture-commanding system but also achieve faster recognition by developing dedicated interface hardware including control circuitry. Finally, using the proposed hand-gesture recognition and signal processing methods, a light control module was designed and implemented. The experimental results show that the smart light control system can control the LED module properly through accurate motion detection and gesture classification.
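
A minimal sketch of the control logic described above, with hypothetical gesture labels (not the authors' firmware): map classified EPS hand gestures to an on/off state and a PWM duty cycle for LED brightness.

```python
# Minimal sketch of the control logic with hypothetical gesture labels (not the
# authors' firmware): map classified hand gestures from the EPS front end to an
# on/off state and a PWM duty cycle for LED brightness.
class LightController:
    def __init__(self, step=20):
        self.on = False
        self.duty = 0            # PWM duty cycle in percent
        self.step = step

    def handle(self, gesture):
        if gesture == "toggle":                  # e.g. a hand tap over the sensor
            self.on = not self.on
            self.duty = 60 if self.on else 0
        elif gesture == "swipe_up" and self.on:
            self.duty = min(100, self.duty + self.step)
        elif gesture == "swipe_down" and self.on:
            self.duty = max(10, self.duty - self.step)
        return self.duty         # would be sent to the LED driver as a PWM duty cycle

ctrl = LightController()
for g in ["toggle", "swipe_up", "swipe_up", "swipe_down"]:
    print(g, "->", ctrl.handle(g), "%")
```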