• Title/Abstract/Keyword: Motion Capturing

Search results: 112 (processing time: 0.037 s)

동작 특성 추출 : 동작 모방에 기초한 향상된 역 운동학 (Motion Characteristic Capturing : Example Guided Inverse Kinematics)

  • 탁세윤
    • 한국시뮬레이션학회:학술대회논문집 / 한국시뮬레이션학회 1999년도 춘계학술대회 논문집 / pp.147-151 / 1999
  • This paper extends and enhances existing inverse kinematics techniques using the concept of motion characteristic capturing. Motion characteristic capturing is not about measuring motion by tracking body points. Instead, it starts from pre-measured motion data, extracts the motion characteristics, and applies them to animate other bodies. The resulting motion resembles the originally measured one despite arbitrary dimensional differences between the bodies. Motion characteristic capturing is thus a new principle of kinematic motion generalization that processes measurements to generate realistic animation of human beings or other living creatures.

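The retargeting idea described above — extract angular characteristics from measured motion, then re-apply them to a body of different dimensions — can be sketched for a single rotating link as follows (the one-link setup and all names are illustrative assumptions, not the paper's method):

```python
import math

def extract_characteristics(positions):
    """Recover joint angles (the motion 'characteristic') from measured
    2-D end-point positions of a single rotating link."""
    return [math.atan2(y, x) for x, y in positions]

def apply_characteristics(angles, link_length):
    """Re-generate end-point positions for a body with a different
    link length, preserving the original angular motion."""
    return [(link_length * math.cos(a), link_length * math.sin(a))
            for a in angles]

# Source body: a link of length 1.0 sweeping a quarter circle.
source = [(math.cos(a), math.sin(a)) for a in (0.0, math.pi / 4, math.pi / 2)]
angles = extract_characteristics(source)

# Target body is twice as long; the retargeted motion keeps the same angles.
target = apply_characteristics(angles, 2.0)
```

The target's end points differ from the source's, but the angular trajectory — the "characteristic" — is identical, which is the sense in which the retargeted motion resembles the measured one.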

Exoskeleton 형태의 모션 캡쳐 장치를 이용한 이동로봇의 원격 제어 (Teleoperated Control of a Mobile Robot Using an Exoskeleton-Type Motion Capturing Device Through Wireless Communication)

  • 전풍우;정슬
    • 제어로봇시스템학회논문지 / Vol. 10, No. 5 / pp.434-441 / 2004
  • In this paper, an exoskeleton-type motion capturing system is designed and implemented. The device has 12 degrees of freedom in total to represent human arm motions. Forward and inverse kinematics of the device are analyzed to identify its singular positions. With the designed model parameters, simulation studies verify that the motion capturing system can represent human motions throughout its workspace. As a counterpart of the exoskeleton system, a mobile robot is built to follow human motion within its constraints. Teleoperation experiments, in which the exoskeleton device controls the mobile robot, demonstrate a feasible application of the wireless man-machine interface.
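The kinematic analysis mentioned above — forward kinematics plus a check for singular positions — can be illustrated on a planar 2-link arm, a much-reduced stand-in for the 12-DOF device (link lengths and function names are assumptions):

```python
import math

def forward_kinematics(t1, t2, l1=0.3, l2=0.25):
    """End-effector position of a planar 2-link arm (lengths in meters)."""
    x = l1 * math.cos(t1) + l2 * math.cos(t1 + t2)
    y = l1 * math.sin(t1) + l2 * math.sin(t1 + t2)
    return x, y

def jacobian_det(t1, t2, l1=0.3, l2=0.25):
    """Determinant of the arm's Jacobian; it vanishes at singular positions."""
    return l1 * l2 * math.sin(t2)

def is_singular(t1, t2, eps=1e-6):
    """The planar arm is singular when fully stretched or folded
    (elbow angle 0 or pi), where the Jacobian loses rank."""
    return abs(jacobian_det(t1, t2)) < eps
```

Locating such configurations in advance is what "make sure of its singular positions" buys: near them, inverse kinematics demands unbounded joint rates, so a capture device avoids commanding motion through them.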

Computerized Human Body Modeling and Work Motion-capturing in a 3-D Virtual Clothing Simulation System for Painting Work Clothes Development

  • Park, Gin Ah
    • 패션비즈니스 / Vol. 19, No. 3 / pp.130-143 / 2015
  • By studying 3-D virtual human modeling, motion capturing, and clothing simulation for easier and safer work clothes development, this research aimed (1) to categorize heavy manufacturing work motions; (2) to generate a 3-D virtual male model and establish painting work motions within a 3-D virtual clothing simulation system through computerized body scanning and motion capturing; and (3) to present simulated images of painting work clothes developed from the virtual male avatar's body measurements by implementing the defined work motions in the simulation system. To this end, a male subject's body was 3-D scanned and also measured directly. Editing the 3-D virtual model required the scanned total body shape in digital format, which was revised using the 3-D Studio MAX and Maya rendering tools. In addition, heavy industry workers' motions were recorded by video camera at manufacturing sites and analyzed to categorize the painting work motions. The analysis yielded four motion categories: standing, bending, kneeling, and walking. Each category was further divided into more detailed motions according to sub-work posture factors: arm angle, arm direction, elbow bending angle, waist bending angle, waist bending direction, and knee bending angle. Finally, implementing the painting work motions within the 3-D clothing simulation system produced images of the virtual painting work clothes simulated in a dynamic mode.
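The categorization scheme above — motion categories refined by six posture factors — maps naturally onto a small record type. The thresholds below are invented purely for illustration, and walking is omitted because it is distinguished by a motion sequence rather than a single posture:

```python
from dataclasses import dataclass

@dataclass
class PostureFactors:
    """Sub-work posture factors, named after the six factors in the abstract."""
    arm_angle: float        # degrees
    arm_direction: str      # e.g. "front", "side", "up"
    elbow_bending: float    # degrees
    waist_bending: float    # degrees
    waist_direction: str
    knee_bending: float     # degrees

def categorize(p: PostureFactors) -> str:
    """Illustrative rule-of-thumb mapping to the static categories from the
    abstract (standing, bending, kneeling); thresholds are assumptions."""
    if p.knee_bending > 90:
        return "kneeling"
    if p.waist_bending > 30:
        return "bending"
    return "standing"
```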

Advance Crane Lifting Safety through Real-time Crane Motion Monitoring and Visualization

  • Fang, Yihai;Cho, Yong K.
    • 국제학술발표논문집 / The 6th International Conference on Construction Engineering and Project Management / pp.321-323 / 2015
  • Monitoring crane motion in real time is the first step toward identifying and mitigating crane-related hazards on construction sites. However, no accurate and reliable crane motion capturing technique has been available for this purpose. The objective of this research is to explore a method for real-time crane motion capturing and an approach for assisting hazard detection. To achieve this goal, the research employed: 1) a sensor-based method that accurately, reliably, and comprehensively captures crane motions in real time; 2) computationally efficient algorithms for fusing and processing sensing data (e.g., distance, angle, acceleration) from different types of sensors; 3) an approach that integrates crane motion data with known as-is environment data to detect hazards associated with lifting tasks; and 4) a strategy that effectively presents the crane operator with crane motion information and warns of potential hazards. A prototype system was developed and tested on a real crane in a field environment. The results show that the system can continuously and accurately monitor crane motion in real time.
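Two of the listed components — fusing heterogeneous sensor data and checking crane motion against as-is environment data — can be sketched in a few lines. The complementary filter and the 2 m clearance radius are illustrative assumptions, not the paper's algorithms:

```python
import math

def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """One fusion step for a boom-angle estimate: integrate the gyro rate
    (smooth but drifting) and correct it with the accelerometer-derived
    angle (noisy but unbiased)."""
    return alpha * (angle + gyro_rate * dt) + (1 - alpha) * accel_angle

def hazard_warnings(boom_tip, obstacles, clearance=2.0):
    """Return the as-is environment points (meters) lying within the
    clearance radius of the boom tip, i.e. candidates for an operator warning."""
    return [p for p in obstacles if math.dist(boom_tip, p) < clearance]
```

In a monitoring loop, the filter would run at sensor rate to track boom and jib angles, and the clearance check would run on the resulting tip position against the site model.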


실감만남 공간에서의 비전 센서 기반의 사람-로봇간 운동 정보 전달에 관한 연구 (Vision-based Human-Robot Motion Transfer in Tangible Meeting Space)

  • 최유경;나성권;김수환;김창환;박성기
    • 로봇학회논문지 / Vol. 2, No. 2 / pp.143-151 / 2007
  • This paper deals with a tangible interface system that introduces a robot as a remote avatar. It focuses on a new method that makes a robot imitate human arm motions captured in a remote space. The method is functionally divided into two parts: capturing the human motion and adapting it to the robot. In the capturing part, we propose a modified metaball potential function for real-time performance and high accuracy. In the adapting part, we suggest a geometric scaling method that resolves the structural difference between a human and a robot. With this method, we implemented a tangible interface and tested its speed and accuracy.

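The two ingredients named in the abstract can be hinted at briefly: a plain soft-object falloff standing in for the paper's modified metaball potential, and a uniform geometric scaling from human arm length to robot arm length (both are simplifications, not the authors' exact formulations):

```python
def metaball_potential(r, radius):
    """Simple quadratic soft-object falloff: 1 at the center, 0 at the
    boundary. The paper uses a modified potential; this is a vanilla form."""
    if r >= radius:
        return 0.0
    return (1.0 - (r / radius) ** 2) ** 2

def scale_to_robot(points, human_arm_len, robot_arm_len):
    """Geometric scaling: map shoulder-relative human joint positions
    (x, y, z) onto a robot arm of a different length, preserving the
    shape of the motion despite the structural difference."""
    s = robot_arm_len / human_arm_len
    return [(s * x, s * y, s * z) for x, y, z in points]
```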

단일곡률궤적을 이용한 이동물체의 포획 알고리즘 (A Capturing Algorithm of Moving Object using Single Curvature Trajectory)

  • 최병석;이장명
    • 제어로봇시스템학회논문지 / Vol. 12, No. 2 / pp.145-153 / 2006
  • An optimal capturing trajectory for a moving object is proposed in this paper, based on the observation that a single-curvature path is more accurate than double- or triple-curvature paths. Moving distance, moving time, and trajectory error are the major factors considered in deciding an optimal capturing path: the moving time and distance are minimized while the trajectory error is kept as small as possible. The three factors are compared for single- and double-curvature trajectories to show the superiority of the single-curvature trajectory, and a kinematic model of a mobile robot is proposed to follow and capture the moving object along it. The capturing scenario can be summarized as follows: 1) the motion of the moving object is captured by a CCD camera; 2) the position of the moving object is estimated from the image frames; and 3) the mobile robot follows the moving object along the single-curvature trajectory that matches the positions and orientations of the moving object and the mobile robot at the final moment. The effectiveness of the single-curvature trajectory model and capturing algorithm is proved through simulations and real experiments using a 2-DOF wheel-based mobile robot.
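In the robot's local frame (robot at the origin, heading along +x), a constant-curvature path to a capture point is a circular arc whose radius follows from the chord-tangent relation. The helper below illustrates only that geometric core, not the paper's full matching of final position and orientation:

```python
import math

def single_curvature_radius(target_x, target_y):
    """Radius of the unique constant-curvature (circular) arc that leaves
    the origin tangent to the +x axis and passes through the target point.
    Chord-tangent relation: R = (x^2 + y^2) / (2 * y)."""
    if abs(target_y) < 1e-12:
        return math.inf  # target straight ahead: zero curvature
    return (target_x ** 2 + target_y ** 2) / (2 * target_y)
```

For a target at (1, 1) the arc is a unit circle centered at (0, 1): it passes through the target and is tangent to the robot's heading at the origin, which is why a single curvature value suffices.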

관성센서 기반 신발형 보행 분석기의 신뢰성 연구 (Reliability of 3D-Inertia Measurement Unit Based Shoes in Gait Analysis)

  • 주지용;김영관;박재영
    • 한국운동역학회지 / Vol. 25, No. 1 / pp.123-130 / 2015
  • Purpose: The purpose of this study was to investigate the reliability of 3D inertial measurement unit (IMU) based shoes in gait analysis by comparing their results with those of an optical motion capturing system, and to collect reference gait data of healthy subjects with this device. Methods: The Smart Balance® system of 3D-IMU based shoes and Osprey® motion capturing cameras were used to collect motion data simultaneously. Forty-four healthy subjects, consisting of individuals in their 20s (N=20), 40s (N=13), and 60s (N=11), participated voluntarily. They walked naturally on a treadmill for one minute at each of 4 target speeds (3, 4, 5, 6 km/h). Results: Cadence (ICC=.998), step length (ICC=.970), stance phase (ICC=.845), and double-support phase (ICC=.684) from the 3D-IMU based shoes agreed with the results of the optical motion system. Gait data of healthy subjects across treadmill speeds and ages matched previous literature, showing increased cadence and reduced step length for elderly subjects. Conclusion: 3D-IMU based shoes are a satisfactory alternative for measuring linear gait parameters.
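The agreement statistic reported above can be computed, for example, as a one-way random-effects ICC(1,1); the abstract does not state which ICC form was used, so this is only one plausible reading:

```python
def icc_1_1(data):
    """One-way random-effects ICC(1,1). `data` is a list of subjects, each a
    list of k measurements of the same gait parameter (e.g. one from the
    IMU shoes and one from the optical system)."""
    n = len(data)
    k = len(data[0])
    grand = sum(sum(row) for row in data) / (n * k)
    subj_means = [sum(row) / k for row in data]
    ms_between = k * sum((m - grand) ** 2 for m in subj_means) / (n - 1)
    ms_within = sum((x - m) ** 2
                    for row, m in zip(data, subj_means)
                    for x in row) / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)
```

Perfect agreement between the two devices gives ICC = 1; values near the reported .998 for cadence indicate that nearly all variance comes from differences between subjects rather than between devices.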

융합센서 기반의 모션캡처 시스템 (Motion Capture System using Integrated Pose Sensors)

  • 김병열;한영준;한헌수
    • 한국컴퓨터정보학회논문지 / Vol. 15, No. 4 / pp.65-74 / 2010
  • This paper proposes a new technique for simple motion capture that fuses a single camera providing 2-D position information with motion sensors, composed of an accelerometer and a gyroscope, providing orientation information for specific body parts, in order to overcome the problems of conventional optical motion capture such as interference between markers and complicated system setup. Motion recognition is achieved mainly by fusing the image-based position information with the sensor-based orientation information. The camera provides the positions of color markers attached to visible body parts as reference points, while the motion sensors measure the movement direction and speed of each part, so that the 3-D pose of the markers can be determined. Since the proposed system uses only the minimum sensor information needed to measure human motion, it has the advantage of a very simple and economical system configuration and sensor installation. These advantages were verified through various experiments.
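The fusion step — a single camera fixing only the viewing direction of each color marker, with the motion sensors supplying the missing range — reduces to a pinhole back-projection. Here `depth` stands in for the sensor-derived distance, and the focal length is an arbitrary assumption:

```python
import math

def pixel_to_ray(u, v, f):
    """Back-project a pixel into a unit viewing ray (pinhole model,
    coordinates already relative to the principal point)."""
    n = math.sqrt(u * u + v * v + f * f)
    return (u / n, v / n, f / n)

def fuse_pose(u, v, f, depth):
    """Fix a marker's 3-D position by scaling the camera ray until its
    z-component reaches the depth recovered from the motion sensors."""
    rx, ry, rz = pixel_to_ray(u, v, f)
    s = depth / rz
    return (s * rx, s * ry, s * rz)
```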

Proposal of Camera Gesture Recognition System Using Motion Recognition Algorithm

  • Moon, Yu-Sung;Kim, Jung-Won
    • 전기전자학회논문지 / Vol. 26, No. 1 / pp.133-136 / 2022
  • This paper concerns a motion gesture recognition system and proposes the following improvement to the flaws of current systems: a recognition algorithm that uses the video image of the entire hand and reads its motion gestures to improve recognition accuracy. The system comprises an image capturing unit that obtains images of the area applicable for gesture reading, a motion extraction unit that extracts the motion area of the image, and a hand gesture recognition unit that reads the motion gestures of the extracted area. The proposed motion gesture algorithm achieves a 20% improvement in accuracy over the current system.
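The three-unit pipeline described above can be caricatured in pure Python: frame differencing as the motion extraction unit, and the centroid of the moving region as the simplest possible cue for the recognition unit (the paper's actual algorithm is not disclosed in the abstract):

```python
def extract_motion(prev_frame, frame, threshold=30):
    """Motion extraction unit: per-pixel absolute frame difference,
    thresholded into a binary motion mask (frames are 2-D lists of
    grayscale values)."""
    return [[1 if abs(a - b) > threshold else 0
             for a, b in zip(prev_row, cur_row)]
            for prev_row, cur_row in zip(prev_frame, frame)]

def motion_centroid(mask):
    """Recognition stage reduced to one cue: the centroid of the moving
    region, whose displacement over successive frames gives the gesture
    direction. Returns None when no motion was detected."""
    pts = [(x, y) for y, row in enumerate(mask)
           for x, v in enumerate(row) if v]
    if not pts:
        return None
    return (sum(x for x, _ in pts) / len(pts),
            sum(y for _, y in pts) / len(pts))
```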