• Title/Summary/Keyword: Kinect Sensor

Development of living body information and behavior monitoring system for nursing person

  • Ichiki, Ai;Sakamoto, Hidetoshi;Ohbuchi, Yoshifumi
    • Journal of Engineering Education Research / v.17 no.4 / pp.15-20 / 2014
  • A non-contact system for easily detecting a nursed person's vital information and monitoring their behavior is developed, consisting of a "Kinect" sensor and a thermography camera. The "Kinect" sensor captures the body contour and body movement and outputs the imaging data in real time. The thermography camera detects the respiration state, body temperature, and so on. In this study, the practicability of the system was verified.
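
The abstract does not detail how respiration is extracted from the thermography stream; as a purely illustrative sketch (hypothetical chest ROI, frame rate, and NumPy-only processing, not the authors' implementation), a chest-region temperature signal could be turned into a breathing rate roughly like this:

```python
import numpy as np

def respiration_rate(thermal_frames, chest_roi, fps=30.0):
    """Estimate breaths per minute from a thermal frame sequence.

    thermal_frames : array of shape (n_frames, height, width) with temperatures
    chest_roi      : (row_min, row_max, col_min, col_max), assumed to be located
                     beforehand using the Kinect body contour (hypothetical step)
    """
    r0, r1, c0, c1 = chest_roi
    # Mean chest temperature per frame rises and falls with breathing.
    signal = thermal_frames[:, r0:r1, c0:c1].mean(axis=(1, 2))
    signal = signal - signal.mean()

    # Pick the dominant frequency in a plausible breathing band (6-60 breaths/min).
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    spectrum = np.abs(np.fft.rfft(signal))
    band = (freqs > 0.1) & (freqs < 1.0)
    peak_freq = freqs[band][np.argmax(spectrum[band])]
    return peak_freq * 60.0

# Toy usage: 20 s of synthetic frames "breathing" at about 15 breaths per minute.
t = np.arange(0, 20, 1.0 / 30.0)
frames = 36.5 + 0.2 * np.sin(2 * np.pi * 0.25 * t)[:, None, None] * np.ones((1, 40, 40))
print(round(respiration_rate(frames, (10, 30, 10, 30)), 1))  # ~15.0
```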

Design of System for Upper-limbs Rehabilitation Using Kinect Sensor (키넥트 센서를 이용한 상지재활 시스템의 설계)

  • Park, Myeong-Chul;Jung, Hyon-Chel
    • Proceedings of the Korean Society of Computer Information Conference / 2015.07a / pp.309-310 / 2015
  • This study aims to design a system that provides rehabilitation content, including treatment and assessment, to remotely located patients with upper-limb impairments such as stroke. The designed content presents rehabilitation-related upper-limb movements in virtual reality, tracks the patient's actual movements with a Kinect sensor, and provides visual, auditory, and tactile feedback, thereby increasing the therapeutic effect and sense of immersion while evaluating upper-limb motor function.
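
As a rough illustration of the kind of upper-limb metric such a system might compute from Kinect skeleton frames (the joint names, units, and range-of-motion measure below are assumptions for illustration, not the paper's design):

```python
import numpy as np

def joint_angle(a, b, c):
    """Angle (degrees) at joint b formed by points a-b-c, e.g. shoulder-elbow-wrist."""
    v1, v2 = a - b, c - b
    cos = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

def elbow_range_of_motion(frames):
    """frames: list of dicts of 3D joint positions taken from the Kinect skeleton stream."""
    angles = [joint_angle(f["shoulder"], f["elbow"], f["wrist"]) for f in frames]
    return min(angles), max(angles)

# Toy usage with two made-up skeleton frames (metres, Kinect camera space).
frames = [
    {"shoulder": np.array([0.0, 0.4, 2.0]), "elbow": np.array([0.0, 0.1, 2.0]),
     "wrist": np.array([0.0, -0.2, 2.0])},
    {"shoulder": np.array([0.0, 0.4, 2.0]), "elbow": np.array([0.0, 0.1, 2.0]),
     "wrist": np.array([0.25, 0.1, 2.0])},
]
lo, hi = elbow_range_of_motion(frames)
print(f"elbow angle range: {lo:.0f} to {hi:.0f} degrees")
```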

Virtual Fitting Development Based on Hand Gesture Recognition (손동작 인식 기반 Virtual Fitting 개발)

  • Kim, Seung-Yeon;Yu, Min-Ji;Jo, Ha-Jung;Jung, Seung-Won
    • Proceedings of the Korea Information Processing Society Conference / 2019.05a / pp.596-598 / 2019
  • A virtual fitting system based on hand gesture recognition can realize natural fitting using the Kinect sensor. We introduce software for virtually trying on clothes, implemented with Kinect-based pose estimation, gesture recognition, and virtual fitting.
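
A minimal sketch of one sub-step, placing a garment overlay from the two shoulder joints returned by pose estimation; the scaling factors and pixel coordinates are illustrative assumptions, not the authors' values:

```python
import numpy as np

def garment_placement(left_shoulder, right_shoulder, garment_aspect=1.3, widen=1.8):
    """Compute where to draw a shirt overlay, given 2D shoulder positions (pixels)
    from Kinect pose estimation. Returns (top_left_x, top_left_y, width, height)."""
    left_shoulder = np.asarray(left_shoulder, dtype=float)
    right_shoulder = np.asarray(right_shoulder, dtype=float)
    shoulder_width = np.linalg.norm(right_shoulder - left_shoulder)
    width = shoulder_width * widen              # shirt drawn wider than the shoulders
    height = width * garment_aspect
    center = (left_shoulder + right_shoulder) / 2.0
    top_left = center - np.array([width / 2.0, shoulder_width * 0.2])
    return (*top_left, width, height)

# Toy usage: shoulder joints projected into a 640x480 colour frame.
print(garment_placement((260, 180), (380, 182)))
```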

Viewing Angle-Improved 3D Integral Imaging Display with Eye Tracking Sensor

  • Hong, Seokmin;Shin, Donghak;Lee, Joon-Jae;Lee, Byung-Gook
    • Journal of information and communication convergence engineering / v.12 no.4 / pp.208-214 / 2014
  • In this paper, in order to solve the problems of a narrow viewing angle and the flip effect in a three-dimensional (3D) integral imaging display, we propose an improved system using an eye tracking method based on the Kinect sensor. The proposed method introduces two calibration processes. The first performs calibration between the two cameras within the Kinect sensor to collect specific 3D information. The second uses a space calibration to convert coordinates between the Kinect sensor and the coordinate system of the display panel. These calibration processes improve the estimation of the 3D position of the observer's eyes and allow elemental images to be generated at real-time speed based on the estimated position. To show the usefulness of the proposed method, we implement an integral imaging display system using the eye tracking process based on our calibration processes and carry out preliminary experiments measuring the viewing angle and flipping effect for the reconstructed 3D images. The experimental results reveal that the proposed method extends the viewing angle and removes the flipped images compared with the conventional system.
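
The second calibration step amounts to a coordinate transformation; a minimal sketch of applying such a space calibration to an estimated eye position (the 4x4 matrix below is a made-up example, not the paper's calibration result):

```python
import numpy as np

def to_panel_coords(eye_xyz_kinect, T_panel_from_kinect):
    """Transform an eye position from Kinect camera space into the display-panel
    coordinate system using a 4x4 matrix obtained from the space calibration step."""
    p = np.append(np.asarray(eye_xyz_kinect, dtype=float), 1.0)  # homogeneous point
    return (T_panel_from_kinect @ p)[:3]

# Hypothetical calibration result: a 10 cm offset plus a small rotation about y.
theta = np.radians(3.0)
T = np.array([
    [np.cos(theta),  0.0, np.sin(theta), 0.10],
    [0.0,            1.0, 0.0,           0.02],
    [-np.sin(theta), 0.0, np.cos(theta), 0.00],
    [0.0,            0.0, 0.0,           1.00],
])
print(to_panel_coords([0.05, 0.30, 1.20], T))  # eye roughly 1.2 m in front of the Kinect
```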

A Motion Capture and Mapping System: Kinect Based Human-Robot Interaction Platform (동작포착 및 매핑 시스템: Kinect 기반 인간-로봇상호작용 플랫폼)

  • Yoon, Joongsun
    • Journal of the Korea Academia-Industrial cooperation Society / v.16 no.12 / pp.8563-8567 / 2015
  • We propose a human-robot interaction (HRI) platform based on motion capture and mapping. The platform consists of capture, processing/mapping, and action parts. A motion capture sensor, a computer, and avatar and/or physical robots serve as the capture, processing/mapping, and action parts, respectively. Case studies, an interactive presentation and a LEGO robot car, are presented to show the design and implementation process of the Kinect-based HRI platform.
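
The processing/mapping part essentially translates captured motion into commands for the action part; a minimal sketch of one such mapping, with purely illustrative angle and servo ranges:

```python
import numpy as np

def map_to_servo(angle_deg, human_range=(0.0, 180.0), servo_range=(500, 2500)):
    """Map a captured human joint angle onto a servo pulse width (microseconds).
    The ranges are illustrative; a real robot would define its own limits."""
    h_lo, h_hi = human_range
    s_lo, s_hi = servo_range
    ratio = np.clip((angle_deg - h_lo) / (h_hi - h_lo), 0.0, 1.0)
    return int(s_lo + ratio * (s_hi - s_lo))

# Toy usage: captured elbow angles mapped frame by frame onto the robot arm.
for angle in [30.0, 90.0, 150.0]:
    print(angle, "->", map_to_servo(angle))
```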

Extracting Motion Information for Animation Character using Kinect Sensor (Kinect를 이용한 애니메이션 캐릭터 움직임 정보 추출)

  • Choi, Eun-Young;Lee, Soojong;Lee, Imgeun
    • Proceedings of the Korean Society of Computer Information Conference / 2013.07a / pp.289-290 / 2013
  • In this paper, we propose a method for efficiently controlling the movement of an animation character using Kinect. Using the Kinect sensor, which can acquire 3D information about the subject, we extract position data for 15 joints and map them onto the joints of the animation character, expressing the character's movement more naturally. Experimental results confirm that the animation character precisely reproduces the movement of a real person in space. This study thus presents a simple way to carry out character animation work that would otherwise require considerable cost and effort.
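
One simple way to map captured joints onto a character is to keep the captured bone directions but use the character's own bone lengths; the sketch below illustrates that idea (joint names, bone lengths, and positions are assumptions, not the paper's rig):

```python
import numpy as np

# Hypothetical subset of the 15 Kinect joints and the character's bone lengths.
BONES = [("shoulder_l", "elbow_l"), ("elbow_l", "wrist_l")]
CHARACTER_BONE_LENGTH = {("shoulder_l", "elbow_l"): 0.28, ("elbow_l", "wrist_l"): 0.26}

def retarget(kinect_joints, root="shoulder_l"):
    """Rebuild character joint positions from captured bone directions,
    rescaled to the character's own bone lengths (a simple retargeting scheme)."""
    out = {root: np.asarray(kinect_joints[root], dtype=float)}
    for parent, child in BONES:
        direction = np.asarray(kinect_joints[child]) - np.asarray(kinect_joints[parent])
        direction = direction / np.linalg.norm(direction)
        out[child] = out[parent] + direction * CHARACTER_BONE_LENGTH[(parent, child)]
    return out

# Toy usage with made-up Kinect joint positions (metres, camera space).
joints = {"shoulder_l": [0.2, 0.5, 2.0], "elbow_l": [0.2, 0.2, 2.0], "wrist_l": [0.4, 0.2, 2.0]}
for name, pos in retarget(joints).items():
    print(name, np.round(pos, 3))
```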

A study on correcting baseball posture with motion sensor in Kinect (Kinect의 모선 인식 센서를 활용한 야구 자세 교정에 관한 연구)

  • Kim, Yeon Woo;Nasridinov, Aziz
    • Proceedings of the Korea Information Processing Society Conference / 2018.05a / pp.532-534 / 2018
  • This paper presents a study on posture recognition using the Kinect motion recognition sensor, intended to help people working in baseball-related jobs in Korea as well as those who want to learn baseball. The ultimate goal is to let learners correct their posture while watching their own form, with the user rather than the program's creator at the center of the process. Baseball postures stored through Kinect can be compared with the user's own form to provide feedback on posture. The program is designed to make baseball easier to approach and use, and it is expected to help with the posture of those taking up baseball as well as those who already enjoy the game.
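
A hedged sketch of how a stored reference posture could be compared with a live Kinect pose to produce feedback; the joint set, tolerance, and scoring below are illustrative assumptions rather than the paper's method:

```python
import numpy as np

def pose_similarity(reference_angles, live_angles, tolerance_deg=15.0):
    """Compare a stored reference posture with the user's current posture.
    Both inputs map joint names to angles (degrees) computed from Kinect skeletons.
    Returns an overall score in [0, 1] and the joints that are noticeably off."""
    diffs = {j: abs(reference_angles[j] - live_angles[j]) for j in reference_angles}
    off_joints = [j for j, d in diffs.items() if d > tolerance_deg]
    score = float(np.mean([max(0.0, 1.0 - d / 90.0) for d in diffs.values()]))
    return score, off_joints

# Toy usage: a stored batting stance versus the user's current stance.
reference = {"elbow_r": 100.0, "knee_r": 140.0, "hip": 170.0}
live = {"elbow_r": 85.0, "knee_r": 160.0, "hip": 168.0}
score, off = pose_similarity(reference, live)
print(f"score={score:.2f}, adjust: {off}")
```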

A User Interface for Vision Sensor based Indirect Teaching of a Robotic Manipulator (시각 센서 기반의 다 관절 매니퓰레이터 간접교시를 위한 유저 인터페이스 설계)

  • Kim, Tae-Woo;Lee, Hoo-Man;Kim, Joong-Bae
    • Journal of Institute of Control, Robotics and Systems / v.19 no.10 / pp.921-927 / 2013
  • This paper presents a user interface for vision-based indirect teaching of a robotic manipulator with Kinect and IMU (Inertial Measurement Unit) sensors. The user interface system is designed to control the manipulator more easily in joint space, Cartesian space, and the tool frame. We use the user's skeleton data from the Kinect and wrist-mounted IMU sensors to calculate the user's joint angles and wrist movement for robot control. The interface system proposed in this paper allows the user to teach the manipulator without a pre-programming process. This will shorten the teaching time of the robot and eventually increase productivity. Simulation and experimental results are presented to verify the performance of the robot control and interface system.
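
As an illustrative sketch of deriving a joint-space command from skeleton joints plus a wrist-mounted IMU reading (the joint names, reference axes, and IMU interface are assumptions, not the paper's implementation):

```python
import numpy as np

def angle_between(v1, v2):
    cos = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

def joint_space_command(skeleton, imu_wrist_roll_deg):
    """Build a joint-space command from Kinect skeleton joints plus one IMU reading.
    skeleton maps joint names to 3D positions (Kinect camera space, metres)."""
    upper_arm = skeleton["elbow"] - skeleton["shoulder"]
    forearm = skeleton["wrist"] - skeleton["elbow"]
    shoulder_pitch = angle_between(upper_arm, np.array([0.0, -1.0, 0.0]))  # vs. "arm down"
    elbow_flexion = angle_between(-upper_arm, forearm)
    return {"shoulder_pitch": shoulder_pitch,
            "elbow_flexion": elbow_flexion,
            "wrist_roll": imu_wrist_roll_deg}   # wrist orientation taken from the IMU

# Toy usage with made-up skeleton positions and a wrist IMU roll angle.
skeleton = {"shoulder": np.array([0.0, 0.5, 2.0]),
            "elbow":    np.array([0.0, 0.2, 2.0]),
            "wrist":    np.array([0.3, 0.2, 2.0])}
print(joint_space_command(skeleton, imu_wrist_roll_deg=25.0))
```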

Kinect Sensor-based LMA Motion Recognition Model Development

  • Hong, Sung Hee
    • International Journal of Advanced Culture Technology / v.9 no.3 / pp.367-372 / 2021
  • The purpose of this study is to show that movement expression activity by intellectually disabled people is effective in the learning process of Kinect sensor-based LMA motion recognition. We developed ICT motion recognition games for the intellectually disabled based on the movement learning of LMA. The characteristics of movement in Laban's LMA include changes over time in movement produced by a body that perceives space, and the tension or relaxation of emotional expression. The design and implementation of the motion recognition model are described, and the feasibility of the proposed model is verified through a simple experiment. In the experiment, 24 movement expression activities conducted over 10 learning sessions with 5 participants showed an overall average concordance rate of 53.4% or more. Learning motion games that respond to changes in motion had a positive effect on learning emotions.
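
A concordance rate of the kind reported could be computed as simple percentage agreement between prompted and recognized movements; a minimal sketch with made-up trial data (the movement labels are Laban effort actions used purely as examples):

```python
def concordance_rate(prompted, recognized):
    """Percentage of movement prompts whose recognized label matches.
    Inputs are parallel lists of movement labels, one entry per trial."""
    matches = sum(p == r for p, r in zip(prompted, recognized))
    return 100.0 * matches / len(prompted)

# Toy usage: 8 trials from one learning session.
prompted   = ["press", "flick", "dab", "glide", "slash", "punch", "float", "wring"]
recognized = ["press", "dab",   "dab", "glide", "slash", "punch", "float", "punch"]
print(f"{concordance_rate(prompted, recognized):.1f}%")  # 75.0%
```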

Accuracy Comparison of Spatiotemporal Gait Variables Measured by the Microsoft Kinect 2 Sensor Directed Toward and Oblique to the Movement Direction (정면과 측면에 위치시킨 마이크로 소프트 키넥트 2로 측정한 보행 시공간 변인 정확성 비교)

  • Hwang, Jisun;Kim, Eun-jin;Hwang, Seonhong
    • Physical Therapy Korea / v.26 no.1 / pp.1-7 / 2019
  • Background: The Microsoft Kinect, a low-cost gaming device, has been studied as a promising clinical gait analysis tool with satisfactory reliability and validity. However, its accuracy is only guaranteed when it is properly positioned in front of a subject. Objects: The purpose of this study was to identify the error when the Kinect was positioned at a 45° angle to the longitudinal walking plane compared with when the Kinect was positioned in front of a subject. Methods: Sixteen healthy adults performed two testing sessions consisting of walking toward the Kinect and walking at a 45° oblique angle to it. Spatiotemporal outcome measures related to stride length, stride time, step length, step time, and walking speed were examined. To assess the error between the Kinect and 3D motion analysis systems, mean absolute errors (MAE) were determined and compared. Results: The MAE values for stride length, stride time, step time, and walking speed when the Kinect was set in front of subjects were .36, .04, .20, and .32, respectively. The MAE values when the Kinect was placed obliquely were .67, .09, .37, and .58, respectively. There were significant differences in spatiotemporal outcomes between the two conditions. Conclusion: Based on our study, positioning the Kinect directly in front of the person walking towards it provides the optimal spatiotemporal data. Therefore, we conclude that the Kinect should be placed carefully and adequately in clinical settings.
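
The MAE comparison itself is straightforward; a minimal sketch with made-up stride-length samples (not the study's data):

```python
import numpy as np

def mean_absolute_error(kinect_values, reference_values):
    """MAE between Kinect-derived gait variables and the 3D motion analysis reference."""
    kinect_values = np.asarray(kinect_values, dtype=float)
    reference_values = np.asarray(reference_values, dtype=float)
    return float(np.mean(np.abs(kinect_values - reference_values)))

# Toy usage: stride lengths (m) for five strides from both systems (made-up numbers).
kinect_stride    = [1.21, 1.35, 1.18, 1.30, 1.27]
reference_stride = [1.25, 1.31, 1.22, 1.28, 1.30]
print(f"stride length MAE: {mean_absolute_error(kinect_stride, reference_stride):.3f} m")
```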