• Title/Summary/Keyword: Human Motion Capture

Search results: 158

Comparison of Three-Dimensional Dynamic Simulation with Falling Gait Analysis (헛디딤 보행특성과 3 차원 모의해석결과 비교)

  • 명성식;금영광;황성재;김한성;김영호
    • Proceedings of the Korean Society of Precision Engineering Conference / 2004.10a / pp.359-363 / 2004
  • Numerous studies have been performed to analyze the various phenomena of human walking (gait). In the present study, unrecognized walking and recognized walking were analyzed with a three-dimensional motion capture system (VICON Motion Systems Ltd., England) and simulated with a computer program. Two normal males participated in the measurement of unrecognized and recognized walking. Six infrared cameras and four force plates were used, and sixteen reflective markers were attached to each subject to capture the motion. A musculoskeletal model was generated anatomically using ADAMS (MSC Software Corp., USA) and LifeMOD (Biomechanics Research Group Inc., USA). Inverse and forward dynamic simulations were performed, and the simulation results were similar to the experimental results. This study provides a baseline for the dynamic simulation of falling gait, and it will be useful for simulating various other pathologic gaits in elderly people.

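The abstract above mentions both inverse and forward dynamic simulations of gait. As a minimal illustration of what a forward-dynamics step involves (not the ADAMS/LifeMOD musculoskeletal pipeline used in the paper), the sketch below integrates a single rigid segment swinging about a fixed joint under an assumed torque profile; the mass, length, and torque values are placeholders.

```python
import numpy as np

# Minimal forward-dynamics sketch: one rigid segment (e.g. a shank) swinging
# about a fixed joint, driven by an assumed joint torque. This is NOT the
# musculoskeletal model from the paper, only an illustration of forward
# integration of an equation of motion.

m, L = 3.5, 0.4             # assumed segment mass [kg] and length [m]
I = m * L**2 / 3.0          # moment of inertia about the proximal joint
g = 9.81
dt, T = 0.001, 1.0          # time step and duration [s]

def joint_torque(t):
    """Assumed torque profile [N*m]; a real model would use muscle forces."""
    return 2.0 * np.sin(2.0 * np.pi * t)

theta, omega = 0.1, 0.0     # initial joint angle [rad] and velocity [rad/s]
for step in range(int(T / dt)):
    t = step * dt
    # Equation of motion: I * alpha = tau - m*g*(L/2)*sin(theta)
    alpha = (joint_torque(t) - m * g * (L / 2.0) * np.sin(theta)) / I
    omega += alpha * dt      # semi-implicit Euler integration
    theta += omega * dt

print(f"joint angle after {T:.1f} s: {np.degrees(theta):.1f} deg")
```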

Parametrized Construction of Virtual Drivers' Reach Motion to Seat Belt (매개변수로 제어가능한 운전자의 안전벨트 뻗침 모션 생성)

  • Seo, Hye-Won;Cordier, Frederic;Choi, Woo-Jin;Choi, Hyung-Yun
    • Korean Journal of Computational Design and Engineering / v.16 no.4 / pp.249-259 / 2011
  • In this paper we present our work on the parameterized construction of virtual drivers' reach motions to the seat belt using motion capture data. A user can generate a new reach motion by controlling a number of parameters. We approach the problem by using multiple sets of example reach motions and learning the relation between the labeling parameters and the motion data. The work is composed of three tasks. First, we construct a motion database from multiple sets of labeled motion clips obtained with a motion capture device. This involves removing the redundancy of each motion clip by using PCA (Principal Component Analysis) and establishing temporal correspondence among different motion clips by automatic segmentation and piecewise time warping of each clip. Next, we compute motion blending functions by learning the relation between the labeling parameters (age, hip base point (HBP), and height) and the motion parameters represented by a set of PC coefficients. During runtime, on-line motion synthesis is accomplished by evaluating the motion blending function with the user-supplied control parameters.
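
As a rough sketch of the pipeline described in the abstract above, the code below reduces a set of (already time-warped) example motion clips to principal-component coefficients, fits a blending function from the labeling parameters (age, HBP, height) to those coefficients, and synthesizes a new motion for user-supplied parameters. The random data, the number of retained PCs, and the linear least-squares form of the blending function are assumptions for illustration; the paper's actual blending functions may differ.

```python
import numpy as np

rng = np.random.default_rng(0)
n_clips, n_frames, n_dofs = 20, 100, 30
motions = rng.normal(size=(n_clips, n_frames * n_dofs))   # flattened example clips (placeholder data)
params = rng.normal(size=(n_clips, 3))                     # labeling parameters: [age, HBP, height]

# PCA via SVD on mean-centered motion vectors (redundancy removal)
mean = motions.mean(axis=0)
U, S, Vt = np.linalg.svd(motions - mean, full_matrices=False)
k = 5                                                      # assumed number of retained PCs
coeffs = U[:, :k] * S[:k]                                  # PC coefficients per clip

# Blending function: least-squares affine map from parameters to PC coefficients
A = np.hstack([params, np.ones((n_clips, 1))])
W, *_ = np.linalg.lstsq(A, coeffs, rcond=None)

# Runtime synthesis for user-supplied control parameters (hypothetical values)
new_params = np.array([35.0, 0.2, 1.75])
new_coeffs = np.append(new_params, 1.0) @ W
new_motion = (new_coeffs @ Vt[:k] + mean).reshape(n_frames, n_dofs)
print(new_motion.shape)
```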

Vision-based Human-Robot Motion Transfer in Tangible Meeting Space (실감만남 공간에서의 비전 센서 기반의 사람-로봇간 운동 정보 전달에 관한 연구)

  • Choi, Yu-Kyung;Ra, Syun-Kwon;Kim, Soo-Whan;Kim, Chang-Hwan;Park, Sung-Kee
    • The Journal of Korea Robotics Society / v.2 no.2 / pp.143-151 / 2007
  • This paper deals with a tangible interface system that introduces a robot as a remote avatar. It focuses on a new method that makes a robot imitate human arm motions captured in a remote space. Our method is functionally divided into two parts: capturing the human motion and adapting it to the robot. In the capturing part, we propose a modified metaball potential function for real-time performance and high accuracy. In the adapting part, we suggest a geometric scaling method for resolving the structural difference between a human and a robot. With our method, we implemented a tangible interface and evaluated its speed and accuracy.

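The adapting step in the abstract above relies on geometric scaling to bridge the structural difference between the human and the robot. Below is a minimal sketch of that idea, assuming the captured hand position (relative to the shoulder) is scaled by the ratio of robot reach to human reach; the link lengths and captured point are made-up values, not the paper's exact method.

```python
import numpy as np

human_reach = 0.60          # human upper arm + forearm length [m] (assumed)
robot_reach = 0.45          # robot arm reach [m] (assumed)

def scale_to_robot(hand_pos, shoulder_pos):
    """Map a captured hand position into the robot's workspace by geometric scaling."""
    rel = np.asarray(hand_pos) - np.asarray(shoulder_pos)
    return np.asarray(shoulder_pos) + rel * (robot_reach / human_reach)

human_shoulder = np.array([0.0, 0.2, 1.4])   # shoulder position [m] (assumed)
human_hand = np.array([0.35, 0.1, 1.1])      # captured hand marker position [m] (assumed)
print(scale_to_robot(human_hand, human_shoulder))
```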

Implementation of Human Motion Following Robot through Wireless Communication Interface

  • Choi, Hyoukryeol;Jung, Kwangmok;Ryew, SungMoo;Kim, Hunmo;Jeon, Jaewook;Nam, Jaedo
    • Institute of Control, Robotics and Systems: Conference Proceedings / 2002.10a / pp.36.3-36 / 2002
  • Motion capture system • Exoskeleton mechanism • Kinematics analysis • Man-machine interface • Wireless communication • Control algorithm


Training Avatars Animated with Human Motion Data (인간 동작 데이타로 애니메이션되는 아바타의 학습)

  • Lee, Kang-Hoon;Lee, Je-Hee
    • Journal of KIISE: Computer Systems and Theory / v.33 no.4 / pp.231-241 / 2006
  • Creating controllable, responsive avatars is an important problem in computer games and virtual environments. Recently, large collections of motion capture data have been exploited for increased realism in avatar animation and control. Large motion sets have the advantage of accommodating a broad variety of natural human motion. However, when a motion set is large, the time required to identify an appropriate sequence of motions becomes the bottleneck for achieving interactive avatar control. In this paper, we present a novel method for training avatar behaviors from unlabelled motion data in order to animate and control avatars at minimal runtime cost. Based on a machine learning technique called Q-learning, our training method allows the avatar to learn how to act in any given situation through trial-and-error interactions with a dynamic environment. We demonstrate the effectiveness of our approach through examples that include avatars interacting with each other and with the user.
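
The training method above is based on Q-learning. The sketch below shows the standard tabular Q-learning update on a toy environment; the avatar's actual state and action spaces, rewards, and motion data are far richer than this stand-in.

```python
import numpy as np

rng = np.random.default_rng(0)
n_states, n_actions = 10, 4
Q = np.zeros((n_states, n_actions))
alpha, gamma, epsilon = 0.1, 0.9, 0.2   # learning rate, discount, exploration rate (assumed)

def step(state, action):
    """Toy stand-in environment: returns (next_state, reward)."""
    next_state = (state + action) % n_states
    reward = 1.0 if next_state == 0 else 0.0
    return next_state, reward

state = int(rng.integers(n_states))
for _ in range(5000):
    # epsilon-greedy action selection (trial-and-error exploration)
    if rng.random() < epsilon:
        action = int(rng.integers(n_actions))
    else:
        action = int(np.argmax(Q[state]))
    next_state, reward = step(state, action)
    # Q-learning update rule
    Q[state, action] += alpha * (reward + gamma * Q[next_state].max() - Q[state, action])
    state = next_state

print(np.argmax(Q, axis=1))   # learned greedy action per state
```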

Development of Motion Capture System (동작 획득 시스템의 개발)

  • U, Jeong-Jae;Choe, Hyeong-Sik;Kim, Yeong-Sik;Jeon, Dae-Won
    • Journal of the Korean Society for Precision Engineering / v.19 no.10 / pp.139-146 / 2002
  • We developed a motion capture system to utilize information on human walking motion. The system is composed of mechanical and electronic devices that obtain joint-angle data, and software that analyzes the obtained data and transforms it into input for a biped walking robot. The mechanical system consists of a pair of links with three revolute joints, with potentiometers mounted on the joint axes to sense the rotation angles. Analog signals from the potentiometers are converted into digital data through a low-pass filter and an A/D converter, and then stored on the computer. We analyzed the walking characteristics by applying an FFT to the digital data and then performed a 3-D computer simulation using the data. Finally, we applied the processed data to a biped walking robot.
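
As a small sketch of the analysis step described above, the code below applies an FFT to a sampled joint-angle signal (standing in for the filtered, A/D-converted potentiometer output) to find the dominant gait frequency. The sampling rate and the synthetic signal are assumptions.

```python
import numpy as np

fs = 100.0                                  # assumed sampling rate [Hz]
t = np.arange(0, 10, 1 / fs)                # 10 s of walking
gait_freq = 0.9                             # assumed stride frequency [Hz]
# synthetic joint-angle signal [deg] with measurement noise
angle = 20 * np.sin(2 * np.pi * gait_freq * t) + np.random.default_rng(0).normal(0, 1, t.size)

# FFT of the mean-removed signal and the corresponding frequency axis
spectrum = np.abs(np.fft.rfft(angle - angle.mean()))
freqs = np.fft.rfftfreq(t.size, d=1 / fs)
dominant = freqs[np.argmax(spectrum)]
print(f"dominant gait frequency: {dominant:.2f} Hz")
```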

Inertial Motion Sensing-Based Estimation of Ground Reaction Forces during Squat Motion (관성 모션 센싱을 이용한 스쿼트 동작에서의 지면 반력 추정)

  • Min, Seojung;Kim, Jung
    • Journal of the Korean Society for Precision Engineering / v.32 no.4 / pp.377-386 / 2015
  • Joint force/torque estimation by inverse dynamics is a traditional tool in biomechanical studies. Conventionally, the kinematic data of the human body are obtained with motion capture cameras, whose bulkiness and occlusion problems make it hard to capture a broad range of movement. As an alternative, inertial motion sensing using cheap, small inertial sensors has been studied recently. In this research, the performance of inertial motion sensing, especially for calculating inverse dynamics, is studied. Kinematic data from inertial motion sensors are used to calculate the ground reaction force (GRF), which is compared to force plate readings (ground truth) and additionally to the estimate from the optical method. The GRF estimates showed high correlation and low normalized RMSE (R = 0.93, normalized RMSE < 0.02 of body weight), performing even better than the conventional optical method. This result demonstrates that inertial motion sensing is accurate enough to be used in inverse dynamics analysis.
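
A minimal sketch of the whole-body Newton relation that underlies GRF estimation from motion data: with both feet on the ground, the total ground reaction force equals the sum over body segments of mass times (acceleration minus gravity). The segment masses and accelerations below are placeholder values, not the paper's inertial-sensor data or its full estimation model.

```python
import numpy as np

g = np.array([0.0, 0.0, -9.81])   # gravity vector [m/s^2]

# assumed segment masses [kg]: pelvis, two thighs, two shanks, upper body
masses = np.array([10.2, 7.0, 7.0, 3.5, 3.5, 42.0])
# assumed segment centre-of-mass accelerations [m/s^2] at one time sample of a squat
accels = np.array([[0.0, 0.0, -1.2],
                   [0.0, 0.0, -0.8],
                   [0.0, 0.0, -0.8],
                   [0.0, 0.0, -0.3],
                   [0.0, 0.0, -0.3],
                   [0.0, 0.0, -1.5]])

# Newton: GRF = sum_i m_i * (a_i - g)
grf = np.sum(masses[:, None] * (accels - g), axis=0)
body_weight = masses.sum() * 9.81
print(f"vertical GRF: {grf[2]:.0f} N ({grf[2] / body_weight:.2f} BW)")
```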

Biomechanical Analysis and Evaluation Technology Using Human Multi-Body Dynamic Model (인체 다물체 동역학 모델을 이용한 생체역학 분석 및 평가 기술)

  • Kim, Yoon-Hyuk;Shin, June-Ho;Khurelbaatar, Tsolmonbaatar
    • Journal of the Korean Society for Nondestructive Testing / v.31 no.5 / pp.494-499 / 2011
  • This paper presents biomechanical analysis and evaluation technology for the musculoskeletal system based on a multi-body human dynamic model and 3-D motion capture data. First, a medical-image-based geometric model and tissue material properties were used to develop the human dynamic model, and motion analysis techniques based on 3-D motion capture data were developed to quantify in-vivo joint kinematics, joint moments, joint forces, and muscle forces. Walking and push-up motions were investigated using the developed model. The present model and technologies would be useful for the biomechanical analysis and evaluation of human activities.
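
As a quasi-static sketch of one piece of the joint-load evaluation described above, the code below computes the external moment of the ground reaction force about a joint centre as r × F. The joint centre, centre of pressure, and GRF values are assumed, and the paper's model additionally accounts for segment inertia and muscle forces.

```python
import numpy as np

grf = np.array([10.0, 0.0, 700.0])          # assumed ground reaction force [N]
cop = np.array([0.12, 0.05, 0.0])           # assumed centre of pressure [m]
ankle = np.array([0.05, 0.05, 0.08])        # assumed ankle joint centre [m]

r = cop - ankle                              # lever arm from joint centre to CoP
external_moment = np.cross(r, grf)           # moment of the GRF about the ankle [N*m]
print(external_moment)
```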

Design of Embedded FPGA for Controlling Humanoid Robot Arms Using Exoskeleton Motion Capture System (Exoskeleton 모션 캡처 장치로 다관절 로봇의 원격제어를 하기 위한 FPGA 임베디드 제어기 설계)

  • Lee, Woon-Kyu;Jung, Seul
    • Journal of Institute of Control, Robotics and Systems / v.13 no.1 / pp.33-38 / 2007
  • In this paper, the hardware implementation of the interface and control between two robots, a master and a slave robot, is designed. The master robot is a motion-capturing device that captures the motions of the human operator who wears it. The slave robot is the corresponding pair of humanoid robot arms. Motions captured by the master robot are transferred to the slave robot so that it follows the master. All hardware components, such as the PID controllers, the communication with the master robot, the encoder counters, and the PWM generators, are embedded on a single FPGA chip. Experimental studies are conducted to demonstrate the performance of the FPGA controller design.
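
The abstract above mentions PID controllers embedded on the FPGA. The sketch below shows a discrete PID control law in Python for readability (on the FPGA it would be fixed-point hardware logic), tracking a master-arm command with a toy single-joint plant; the gains, sampling time, and joint inertia are illustrative assumptions.

```python
class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_measured = 0.0

    def update(self, setpoint, measured):
        error = setpoint - measured
        self.integral += error * self.dt
        # derivative on the measurement avoids a kick when the setpoint jumps
        derivative = -(measured - self.prev_measured) / self.dt
        self.prev_measured = measured
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Toy single-joint plant: the slave joint tracks a 1.0 rad command from the master arm
pid = PID(kp=8.0, ki=2.0, kd=0.5, dt=0.001)
inertia = 0.05                       # assumed joint inertia [kg*m^2]
angle, velocity = 0.0, 0.0
for _ in range(2000):                # simulate 2 s at 1 kHz
    torque = pid.update(setpoint=1.0, measured=angle)
    velocity += (torque / inertia) * 0.001
    angle += velocity * 0.001
print(f"slave joint angle after 2 s: {angle:.3f} rad")
```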

Feasibility Study of Gait Recognition Using Points in Three-Dimensional Space

  • Kim, Minsung;Kim, Mingon;Park, Sumin;Kwon, Junghoon;Park, Jaeheung
    • International Journal of Fuzzy Logic and Intelligent Systems / v.13 no.2 / pp.124-132 / 2013
  • This study investigated the feasibility of gait recognition using points on the body in three-dimensional (3D) space based on comparisons of four different feature vectors. To obtain the point trajectories on the body in 3D, gait motion data were captured from 10 participants using a 3D motion capture system, and four shoes with different heel heights were used to study the effects of heel height on gait recognition. Finally, the recognition rates were compared using four methods and different heel heights.