• Title/Summary/Keyword: Kinect 장치 (Kinect device)


Joint Range Measurement and Correction using Kinect Camera (키넥트 카메라를 이용한 관절 가동 범위 측정과 보정)

  • Jeong, Juheon; Yoon, Myeongsuk; Kim, Sangjoon; Park, Gooman
    • Proceedings of the Korean Society of Broadcast Engineers Conference / 2019.11a / pp.63-66 / 2019
  • With the popularization of virtual reality and augmented reality, research on rendering human motion as real-time 3D animation is being actively pursued. In particular, Microsoft's development of the Kinect made it possible to acquire 3D motion information at low cost, with simple operation and no additional equipment. However, the Kinect camera estimates joint information less accurately than marker-based motion capture systems. This paper therefore proposes a system that acquires human joint information with a Kinect camera and applies the range of motion (ROM) of each joint to correct abnormal movements. To obtain the ROM, a subject performs rotational movements for every joint; the recorded rotation data are analyzed to set a normal ROM for each joint, and experiments confirm that the corrected motion is improved.

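The correction idea in this paper, limiting each estimated joint angle to a plausible range of motion, can be illustrated with a short sketch. This is a minimal example assuming joint angles have already been extracted from the Kinect skeleton; the joint names and ROM limits are placeholders, not the values measured in the paper.

```python
import numpy as np

# Illustrative ROM limits in degrees (min, max) per joint angle.
# Placeholder values, not the ranges measured in the paper.
ROM_LIMITS = {
    "elbow_flexion":      (0.0, 145.0),
    "knee_flexion":       (0.0, 135.0),
    "shoulder_abduction": (0.0, 170.0),
}

def correct_pose(joint_angles: dict) -> dict:
    """Clamp each estimated joint angle into its allowed range of motion."""
    corrected = {}
    for name, angle in joint_angles.items():
        lo, hi = ROM_LIMITS.get(name, (-180.0, 180.0))
        corrected[name] = float(np.clip(angle, lo, hi))
    return corrected

# Example: an elbow angle of -12 degrees (impossible hyperextension caused by
# a noisy Kinect estimate) is pulled back to the ROM boundary.
print(correct_pose({"elbow_flexion": -12.0, "knee_flexion": 60.0}))
# {'elbow_flexion': 0.0, 'knee_flexion': 60.0}
```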

Personal Computer Control Using Kinect (키넥트를 이용한 개인용 컴퓨터 제어)

  • Lee, Min-Kyu; Jeon, Jae-Bong
    • Proceedings of the Korean Information Science Society Conference / 2012.06a / pp.343-345 / 2012
  • Today we use many digital devices in daily life. Although many new kinds of digital devices have appeared, they remain bound to conventional input devices: a keyboard, mouse, remote control, or touch panel always requires the user to operate a separate controller. For this reason, interest has grown in the Kinect, which can recognize a user's movements and perform various functions without a separate controller. This study aims to replace existing input devices such as the keyboard and mouse with hand-movement information recognized by the Kinect. Keyboard mode places virtual buttons on the screen and fires an event when the hand position falls inside a button. Mouse mode moves the pointer with the right hand and performs auxiliary operations with the left hand. This enables simple operation without any physical tool in situations where the hands are not free or the user cannot stay still.
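The keyboard mode described above, virtual buttons that fire an event when the tracked hand enters them, reduces to a simple hit test. The sketch below assumes the Kinect hand position has already been projected into screen coordinates; the button layout and event handling are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class VirtualButton:
    label: str
    x: float      # top-left corner in screen coordinates
    y: float
    w: float
    h: float

    def contains(self, hx: float, hy: float) -> bool:
        return self.x <= hx <= self.x + self.w and self.y <= hy <= self.y + self.h

# Hypothetical on-screen layout for a few keys.
BUTTONS = [VirtualButton("A", 100, 400, 80, 80),
           VirtualButton("B", 200, 400, 80, 80),
           VirtualButton("Enter", 300, 400, 160, 80)]

def on_hand_moved(hand_x: float, hand_y: float):
    """Fire a key event when the projected hand position enters a virtual button."""
    for btn in BUTTONS:
        if btn.contains(hand_x, hand_y):
            print(f"key event: {btn.label}")
            return btn.label
    return None

on_hand_moved(130, 430)   # -> "A"
```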

Wearable Robot System Enabling Gaze Tracking and 3D Position Acquisition for Assisting a Disabled Person with Disabled Limbs (시선위치 추적기법 및 3차원 위치정보 획득이 가능한 사지장애인 보조용 웨어러블 로봇 시스템)

  • Seo, Hyoung Kyu; Kim, Jun Cheol; Jung, Jin Hyung; Kim, Dong Hwan
    • Transactions of the Korean Society of Mechanical Engineers A / v.37 no.10 / pp.1219-1227 / 2013
  • A new type of wearable robot is developed for a person with disabled limbs, that is, a person who cannot intentionally move his or her legs and arms. The robot enables the person to grip an object using eye movements alone. A gaze-tracking algorithm detects the pupil movements with which the person looks at the object to be gripped. Using this 2D gaze information, the object is identified and its distance is measured with a Kinect device installed on the robot's shoulder. Several coordinate transformations and a matching scheme then yield the object's 3D position with respect to the base frame, and the final position data are transmitted to the DSP-controlled robot controller, which grips the target object successfully.
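The coordinate transformations mentioned in this abstract can be sketched as a chain of homogeneous transforms: a point measured in the Kinect frame is mapped into the robot base frame before being sent to the controller. The transform values below are placeholders, not the calibration used in the paper.

```python
import numpy as np

def homogeneous(R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Build a 4x4 homogeneous transform from a rotation matrix and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Placeholder calibration: Kinect mounted on the robot shoulder,
# shoulder frame expressed in the robot base frame.
T_base_shoulder   = homogeneous(np.eye(3), np.array([0.0, 0.20, 1.10]))
T_shoulder_kinect = homogeneous(np.eye(3), np.array([0.05, 0.00, 0.10]))

def object_in_base_frame(p_kinect: np.ndarray) -> np.ndarray:
    """Map a 3D point measured by the Kinect into the robot base frame."""
    p = np.append(p_kinect, 1.0)                      # homogeneous coordinates
    return (T_base_shoulder @ T_shoulder_kinect @ p)[:3]

# Example: object seen 0.6 m in front of the Kinect.
print(object_in_base_frame(np.array([0.0, 0.0, 0.6])))
```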

Development non-smoking billboard using augmented reality function (증강현실기능을 이용한 금연 광고판 개발)

  • Hong, Jeong-Soo; Lee, Jin-Dong; Yun, Yong-Gyu; Yoo, Jeong-Ki
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference / 2016.10a / pp.274-276 / 2016
  • The growing number of tobacco users has raised many problems in recent years. Smoking both in public places and indoors harms non-smokers, and many installed smoking booths are of poor quality or lack properly working air-purification equipment, which harms the people around them. In this paper, we introduce an "Augmented Reality Billboard" that helps smokers effectively recognize anti-smoking warning images and health warning messages. A Kinect camera sensor and augmented reality (AR) functions are used to recognize a person's motion and map it to the corresponding coordinate values on the billboard.

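The coordinate mapping this abstract alludes to, placing AR warning graphics on the billboard at the position of the detected person, can be sketched as a pinhole projection of a tracked joint into screen space. The intrinsic parameters and joint data below are assumptions for illustration, not values from the paper.

```python
import numpy as np

# Assumed pinhole intrinsics for the billboard display (placeholder values).
FX, FY = 525.0, 525.0        # focal lengths in pixels
CX, CY = 960.0, 540.0        # principal point for a 1920x1080 billboard

def project_to_billboard(joint_xyz: np.ndarray) -> tuple:
    """Project a 3D joint position (metres, camera frame) to billboard pixel coordinates."""
    x, y, z = joint_xyz
    u = FX * x / z + CX
    v = FY * y / z + CY
    return int(round(u)), int(round(v))

# Example: overlay the warning graphic near the tracked head joint.
head = np.array([0.25, -0.40, 2.0])    # hypothetical Kinect head position
print(project_to_billboard(head))       # pixel where the AR image is drawn
```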

Computer interface construction for recognizing both motion and voice using kinect (키넥트를 이용한 동작과 음성을 인식하기 위한 컴퓨터 인터페이스 구현)

  • Hwang, Sun-Myung; Yeom, Hee-Gyun; Kim, Beom-Sik; Park, Seong-Joo; Lim, Heung-Taek; Lee, Eun-Kyeong; Kang, Jin-Won; Kim, Jeoung-Seop
    • Proceedings of the Korea Information Processing Society Conference / 2012.11a / pp.1529-1530 / 2012
  • Computers are currently operated through a CUI with a keyboard or a GUI with a mouse. For places where a keyboard and mouse are inconvenient and for people who find them difficult to use, we build an NUI that operates the computer with gestures and voice alone, providing a better computing environment and laying the groundwork for new types of content.

An application of Motion Tracking for Interactive Art (인터랙티브 아트를 위한 관람자 아바타 생성 기법)

  • Kim, Donghyun; Kim, Sangwook
    • Proceedings of the Korea Information Processing Society Conference / 2012.04a / pp.375-377 / 2012
  • From the viewpoint of interaction in interactive art, a hardware device is essential for the audience and the work to interact. The Kinect sensor enables real-time motion tracking by extracting the coordinates of the joints that make up the human body, and this can be applied to a variety of content. This paper describes the process of generating 2D and 3D avatars of the viewer based on the joint coordinates of the body. Because each method has a different production process and characteristics, it can be applied according to the nature of the content, and we plan to further study such interaction techniques and content areas.
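A 2D viewer avatar of the kind described here is essentially the skeleton's joint coordinates connected by bones and rendered over the artwork. The sketch below assumes the joint positions have already been read from the Kinect and projected to screen space; the joint names and bone list are illustrative.

```python
# Bones of a simplified 2D stick-figure avatar, given as (parent, child) joint names.
BONES = [("head", "neck"), ("neck", "torso"),
         ("neck", "l_shoulder"), ("l_shoulder", "l_hand"),
         ("neck", "r_shoulder"), ("r_shoulder", "r_hand"),
         ("torso", "l_foot"), ("torso", "r_foot")]

def avatar_segments(joints: dict) -> list:
    """Turn named joint coordinates into line segments for a 2D stick-figure avatar."""
    segments = []
    for parent, child in BONES:
        if parent in joints and child in joints:
            segments.append((joints[parent], joints[child]))
    return segments

# Hypothetical screen-space joints for one tracked viewer.
joints = {"head": (320, 80), "neck": (320, 140), "torso": (320, 260),
          "l_shoulder": (270, 150), "l_hand": (220, 260),
          "r_shoulder": (370, 150), "r_hand": (420, 260),
          "l_foot": (290, 420), "r_foot": (350, 420)}

for a, b in avatar_segments(joints):
    print(f"draw line {a} -> {b}")   # a renderer would draw each bone here
```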

The Comparative Study on Safety Factors at Elevator Management System Operation (승강기 관리시스템 운영 시 안전요소에 관한 비교연구)

  • Park, Joo-Bong; Shin, Seung-Jung
    • The Journal of the Institute of Internet, Broadcasting and Communication / v.14 no.6 / pp.159-162 / 2014
  • A patent office survey shows that many elevator safety patents were filed from 2008 to 2012. In this paper, we analyzed elevator safety patents from 2009 to 2013 and found that the main safety factors are safety control, emergency stop equipment, and door opening equipment. Through reliability statistics, we identified these three as the most important factors for elevator safety and examined why so many patents concern them. Finally, we propose a Kinect-based SCADA elevator system to prevent safety accidents.

Development of Baseball Game Using Leap Motion Controllers (립 모션 컨트롤러를 이용한 야구 게임 개발)

  • Joo, Hyanghan; Cho, Minsoo; In, SeungKyo; Cho, Kyuwon; Min, Jun-Ki
    • KIISE Transactions on Computing Practices / v.21 no.5 / pp.343-350 / 2015
  • While many games that use input devices such as a mouse and keyboard have been published, the number of games that recognize human movement with devices such as the Kinect and Wii has increased. In this paper, we present the development of a baseball game that uses a Leap Motion Controller, which accurately recognizes the movement of a user's fingers. The implemented game consists of characters, a background, and animation, and is played from a third-person point of view. Its major feature is that players control the game with the Leap Motion Controller alone.
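One natural way to turn hand tracking into a bat swing, in the spirit of the game described above, is to threshold the palm speed between frames. The sketch below uses hypothetical palm positions and an assumed threshold and frame rate; it is not taken from the paper or the Leap Motion SDK.

```python
import numpy as np

SWING_SPEED = 1.2          # assumed swing threshold in metres per second
FRAME_DT = 1.0 / 60.0      # assumed tracking rate of 60 frames per second

def detect_swing(palm_positions):
    """Return the frame index where the palm speed first exceeds the swing threshold."""
    p = np.asarray(palm_positions, dtype=float)
    speeds = np.linalg.norm(np.diff(p, axis=0), axis=1) / FRAME_DT
    over = np.nonzero(speeds > SWING_SPEED)[0]
    return int(over[0]) + 1 if over.size else None

# Hypothetical palm trajectory (metres): the hand accelerates sharply at the end.
track = [(0.00, 0.0, 0.0), (0.01, 0.0, 0.0), (0.02, 0.0, 0.0), (0.10, 0.0, 0.0)]
print(detect_swing(track))   # -> 3, the frame where the swing is triggered
```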

Interface of Interactive Contents using Vision-based Body Gesture Recognition (비전 기반 신체 제스처 인식을 이용한 상호작용 콘텐츠 인터페이스)

  • Park, Jae Wan; Song, Dae Hyun; Lee, Chil Woo
    • Smart Media Journal / v.1 no.2 / pp.40-46 / 2012
  • In this paper, we describe interactive content that uses vision-based body gesture recognition as its input interface. Because the content takes the imp, a figure common to Asian folk culture, as its subject, viewers can enjoy it with cultural familiarity, and since players use their own gestures to fight the imp in the game, they are naturally absorbed in it. Users can also choose among multiple endings at the end of the scenario. For gesture recognition, the Kinect is used to obtain the three-dimensional coordinates of each limb joint and capture the static poses of the actions. Vision-based 3D human pose recognition is a method for conveying human gestures in HCI (Human-Computer Interaction): 2D pose-model-based recognition can handle only simple 2D poses in a constrained environment, whereas a 3D pose model, which describes the 3D skeletal structure of the human body, can recognize more complex poses because it can use joint angles and the shape information of body parts. Since a gesture can be represented as a sequence of static poses, we recognize gestures composed of such poses using an HMM. The gesture recognition result serves as the input interface of the interactive content, so users can control the content naturally with gestures alone, and the real-time interaction with the imp is intended to improve immersion and interest.

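The HMM step described above, recognizing a gesture as a sequence of quantized static poses, can be sketched with a small forward-algorithm classifier: each gesture class gets its own HMM over pose symbols, and the model with the highest likelihood wins. The pose alphabet and parameters below are toy values, not those trained in the paper.

```python
import numpy as np

def log_likelihood(obs, start, trans, emit):
    """Forward algorithm: log P(observation sequence | HMM) over discrete pose symbols."""
    alpha = start * emit[:, obs[0]]
    log_p = 0.0
    for o in obs[1:]:
        alpha = (alpha @ trans) * emit[:, o]
        s = alpha.sum()
        log_p += np.log(s)
        alpha /= s                     # rescale to avoid numerical underflow
    return log_p + np.log(alpha.sum())

# Toy 2-state HMMs over a 3-symbol pose alphabet
# (0 = arms down, 1 = arm raised, 2 = arm forward).
GESTURE_MODELS = {
    "punch": (np.array([0.9, 0.1]),
              np.array([[0.6, 0.4], [0.2, 0.8]]),
              np.array([[0.7, 0.1, 0.2], [0.1, 0.1, 0.8]])),
    "wave":  (np.array([0.5, 0.5]),
              np.array([[0.5, 0.5], [0.5, 0.5]]),
              np.array([[0.2, 0.7, 0.1], [0.6, 0.3, 0.1]])),
}

def classify(pose_sequence):
    """Pick the gesture whose HMM assigns the observed pose sequence the highest likelihood."""
    scores = {g: log_likelihood(pose_sequence, *params)
              for g, params in GESTURE_MODELS.items()}
    return max(scores, key=scores.get)

print(classify([0, 0, 2, 2, 2]))   # pose sequence ending arm-forward -> "punch"
```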

A Study on Sensor-Based Upper Full-Body Motion Tracking on HoloLens

  • Park, Sung-Jun
    • Journal of the Korea Society of Computer and Information / v.26 no.4 / pp.39-46 / 2021
  • In this paper, we propose a motion recognition method required at industrial sites in mixed reality. Industrial work requires movements of the entire upper body (grasping, lifting, and carrying), from trunk motion to arm motion. Rather than heavy motion capture equipment or a vision-based device such as the Kinect, we use sensors and wearable devices: two IMU sensors for trunk and shoulder movement and a Myo armband for arm movement. Real-time data from these four sources are fused to enable motion recognition over the entire upper body. In the experiment, the sensors were attached to actual clothing and objects were manipulated through synchronization; the synchronized method showed no errors in either large or small operations. Finally, in the performance evaluation, the average result on the HoloLens was 50 frames for single-handed operation and 60 frames for two-handed operation.
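The fusion of the two IMUs and the Myo armband described above amounts to chaining orientation estimates down the kinematic chain of the upper body. The sketch below illustrates that idea with placeholder segment lengths and planar rotations standing in for full IMU orientations; it is not the authors' calibration or filter.

```python
import numpy as np

def rot_z(deg):
    """Rotation about the z-axis (a stand-in here for a full IMU orientation)."""
    r = np.radians(deg)
    c, s = np.cos(r), np.sin(r)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

# Placeholder segment lengths (metres) for trunk, upper arm, and forearm.
L_TRUNK, L_UPPER, L_FOREARM = 0.5, 0.3, 0.25

def hand_position(trunk_deg, shoulder_deg, elbow_deg):
    """Chain trunk IMU, shoulder IMU, and Myo (forearm) orientations
    to estimate the hand position in the world frame."""
    R_trunk = rot_z(trunk_deg)                      # from the trunk-mounted IMU
    R_shoulder = R_trunk @ rot_z(shoulder_deg)      # from the shoulder-mounted IMU
    R_elbow = R_shoulder @ rot_z(elbow_deg)         # from the Myo armband on the forearm
    p_shoulder = R_trunk @ np.array([0.0, L_TRUNK, 0.0])
    p_elbow = p_shoulder + R_shoulder @ np.array([L_UPPER, 0.0, 0.0])
    p_hand = p_elbow + R_elbow @ np.array([L_FOREARM, 0.0, 0.0])
    return p_hand

# Example: slight trunk lean, arm raised forward at the shoulder, elbow bent.
print(hand_position(trunk_deg=5, shoulder_deg=40, elbow_deg=30))
```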