• Title/Summary/Keyword: Motion Imitation


Motion Imitation Learning and Real-time Movement Generation of Humanoid Using Evolutionary Algorithm (진화 알고리즘을 사용한 인간형 로봇의 동작 모방 학습 및 실시간 동작 생성)

  • Park, Ga-Lam;Ra, Syung-Kwon;Kim, Chang-Hwan;Song, Jae-Bok
    • Journal of Institute of Control, Robotics and Systems
    • /
    • v.14 no.10
    • /
    • pp.1038-1046
    • /
    • 2008
  • This paper presents a framework to generate human-like movements of a humanoid in real time using a movement primitive database of a human. The framework consists of two processes: 1) offline motion imitation learning based on an Evolutionary Algorithm and 2) online motion generation of the humanoid using the database updated by the motion imitation learning. For the offline process, the initial database contains the kinetic characteristics of a human, since it is built from the human's captured motions. The database then develops through the proposed framework of motion learning based on an Evolutionary Algorithm, acquiring the kinetic characteristics of a humanoid in terms of minimal torque or joint jerk. The humanoid generates human-like movements for a given purpose in real time by linearly interpolating the primitive motions in the developed database. The movement of catching a ball was examined in simulation.
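
A minimal sketch (not the authors' code) of the online step described above: generating a new motion by linearly interpolating the two stored primitives whose task parameters bracket the requested one. The database layout and the scalar task parameter (e.g., a ball-catch height) are illustrative assumptions.

```python
import numpy as np

def interpolate_primitive(database, query):
    """database: list of (task_param, joint_trajectory[T, n_joints]) sorted by task_param."""
    params = np.array([p for p, _ in database])
    # Clamp queries outside the covered range to the nearest primitive.
    if query <= params[0]:
        return database[0][1]
    if query >= params[-1]:
        return database[-1][1]
    hi = np.searchsorted(params, query)                    # first primitive above the query
    lo = hi - 1
    w = (query - params[lo]) / (params[hi] - params[lo])   # linear blend weight
    return (1.0 - w) * database[lo][1] + w * database[hi][1]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Two toy primitives: 100 frames x 7 arm joints, keyed by an assumed catch height (m).
    db = [(0.8, rng.standard_normal((100, 7))), (1.2, rng.standard_normal((100, 7)))]
    traj = interpolate_primitive(db, 1.0)                  # blend halfway between the two
    print(traj.shape)
```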

Work chain-based inverse kinematics of robot to imitate human motion with Kinect

  • Zhang, Ming;Chen, Jianxin;Wei, Xin;Zhang, Dezhou
    • ETRI Journal
    • /
    • v.40 no.4
    • /
    • pp.511-521
    • /
    • 2018
  • The ability to realize human-motion imitation using robots is closely related to developments in the field of artificial intelligence. However, it is not easy to imitate human motions entirely owing to the physical differences between the human body and robots. In this paper, we propose a work chain-based inverse kinematics method to enable a robot to imitate human upper-limb motion in real time. Two work chains are built on each arm to ensure motion similarity, such as the end-effector trajectory and the joint-angle configuration. In addition, a two-phase filter is used to remove interference and noise, together with a self-collision avoidance scheme to maintain the stability of the robot during imitation. Experimental results verify the effectiveness of our solution on the humanoid robot Nao-H25 in terms of accuracy and real-time performance.
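
A hedged sketch of one ingredient of the approach above: a damped least-squares inverse-kinematics step driving a simple 3-DoF planar arm model toward a wrist position measured by the depth sensor, with an exponential filter standing in for the paper's two-phase filter. The link lengths, damping factor, and filter gain are illustrative assumptions, not the paper's exact work-chain formulation.

```python
import numpy as np

LINKS = np.array([0.10, 0.09, 0.05])  # upper arm, forearm, hand (m), assumed

def forward(q):
    """Planar forward kinematics: joint angles -> wrist position."""
    angles = np.cumsum(q)
    return np.array([np.sum(LINKS * np.cos(angles)), np.sum(LINKS * np.sin(angles))])

def ik_step(q, target, damping=0.05, eps=1e-5):
    """One damped least-squares update toward the target wrist position."""
    err = target - forward(q)
    # Finite-difference Jacobian of the wrist position w.r.t. the joint angles.
    J = np.stack([(forward(q + eps * np.eye(3)[i]) - forward(q)) / eps for i in range(3)], axis=1)
    dq = J.T @ np.linalg.solve(J @ J.T + damping * np.eye(2), err)
    return q + dq

def smooth(prev, new, alpha=0.3):
    """Simple exponential filter standing in for the two-phase filter."""
    return (1 - alpha) * prev + alpha * new

if __name__ == "__main__":
    q = np.zeros(3)
    target = np.array([0.15, 0.12])        # one noisy Kinect wrist sample (assumed)
    for _ in range(50):
        q = smooth(q, ik_step(q, target))
    print(forward(q), target)
```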

Adaptation of Motion Capture Data of Human Arms to a Humanoid Robot Using Optimization

  • Kim, Chang-Hwan;Kim, Do-Ik
    • Institute of Control, Robotics and Systems: Conference Proceedings
    • /
    • 2005.06a
    • /
    • pp.2126-2131
    • /
    • 2005
  • Interactions of a humanoid with a human are important when the humanoid is requested to provide people with human-friendly services in unknown or uncertain environments. Such interactions may require more complicated and human-like behaviors from the humanoid. In this work the arm motions of a human are discussed as an early stage of human motion imitation by a humanoid. A motion capture system is used to obtain human-friendly arm motions as references. However, the captured motions may not be applied directly to the humanoid, since differences in geometric and dynamic aspects, such as length, mass, degrees of freedom, and kinematic and dynamic capabilities, exist between the humanoid and the human. To overcome this difficulty a method to adapt captured motions to a humanoid is developed. The geometric difference in arm length is resolved by scaling the arm length of the humanoid by a constant. Using the scaled geometry of the humanoid, the imitation of the actor's arm motions is achieved by solving an inverse kinematics problem formulated as an optimization. The errors between the captured trajectories of the actor's arms and the approximated trajectories of the humanoid's arms are minimized. The dynamic capabilities of the joint motors, such as limits on joint position, velocity, and acceleration, are also imposed on the optimization problem. Two motions, waving one hand and performing a statement in sign language, are imitated by a humanoid through dynamics simulation.
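
A minimal sketch, under assumptions, of the optimization-based adaptation described above: for each captured frame, solve for humanoid joint angles that track the length-scaled human wrist position while respecting joint-position bounds. The 2-link arm model, scale factor, and bounds are illustrative, not the paper's formulation.

```python
import numpy as np
from scipy.optimize import minimize

ROBOT_LINKS = np.array([0.20, 0.18])       # humanoid upper arm / forearm (m), assumed
HUMAN_ARM_LEN = 0.60                        # actor arm length (m), assumed
SCALE = ROBOT_LINKS.sum() / HUMAN_ARM_LEN   # constant length scaling of captured points

def wrist(q):
    a = np.cumsum(q)
    return np.array([np.sum(ROBOT_LINKS * np.cos(a)), np.sum(ROBOT_LINKS * np.sin(a))])

def adapt_frame(human_wrist, q_init):
    """Find joint angles minimizing the tracking error for one captured frame."""
    target = SCALE * human_wrist            # geometric scaling of the captured point
    res = minimize(lambda q: np.sum((wrist(q) - target) ** 2), q_init,
                   bounds=[(-np.pi / 2, np.pi / 2)] * 2)   # joint position limits
    return res.x

if __name__ == "__main__":
    captured = np.array([[0.55, 0.10], [0.50, 0.20], [0.45, 0.28]])  # toy human wrist path
    q = np.zeros(2)
    for p in captured:
        q = adapt_frame(p, q)               # warm-start keeps successive frames smooth
        print(q, wrist(q))
```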


Reinforcement Learning of Bipedal Walking with Musculoskeletal Models and Reference Motions (근골격 모델과 참조 모션을 이용한 이족보행 강화학습)

  • Jiwoong Jeon;Taesoo Kwon
    • Journal of the Korea Computer Graphics Society
    • /
    • v.29 no.1
    • /
    • pp.23-29
    • /
    • 2023
  • In this paper, we introduce a method to obtain high-quality bipedal-walking results at low cost for simulated musculoskeletal characters by applying reinforcement learning to motion-captured reference data. We adjust the reference motion data so that the character model can perform it, and then train the corresponding motion through reinforcement learning. We combine imitation of the reference motion with minimization of the muscles' metabolic energy so that the musculoskeletal model learns to walk on two legs in the desired direction. In this way, the musculoskeletal model can learn at a lower cost than with conventional manually designed controllers and perform high-quality bipedal walking.
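
A hedged sketch of the kind of reward the paper describes: an imitation term that tracks the reference frame plus a penalty on muscle effort as a metabolic-energy proxy. The weights, the exponential kernel, and the use of squared muscle activations are assumptions, not the paper's exact terms.

```python
import numpy as np

def walking_reward(sim_pose, ref_pose, muscle_activations,
                   w_imitate=0.8, w_energy=0.2):
    """Reward = imitation of the reference frame - metabolic-energy penalty."""
    pose_err = np.sum((sim_pose - ref_pose) ** 2)
    imitation = np.exp(-2.0 * pose_err)               # 1 when poses match exactly
    energy = np.mean(muscle_activations ** 2)         # crude metabolic proxy
    return w_imitate * imitation - w_energy * energy

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    ref = rng.standard_normal(20)                     # 20 joint angles of a reference frame
    print(walking_reward(ref + 0.05 * rng.standard_normal(20),
                         ref, rng.uniform(0, 1, size=40)))
```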

Development of An Interactive System Prototype Using Imitation Learning to Induce Positive Emotion (긍정감정을 유도하기 위한 모방학습을 이용한 상호작용 시스템 프로토타입 개발)

  • Oh, Chanhae;Kang, Changgu
    • The Journal of Korea Institute of Information, Electronics, and Communication Technology
    • /
    • v.14 no.4
    • /
    • pp.239-246
    • /
    • 2021
  • In the field of computer graphics and HCI, there are many studies on systems that create characters and interact with users naturally. Such studies have focused on the character's response to the user's behavior, and designing the character's behavior to elicit positive emotions from the user remains a difficult problem. In this paper, we develop a prototype of an interaction system that elicits positive emotions from users through the movement of virtual characters using artificial intelligence technology. The proposed system is divided into face recognition and motion generation of a virtual character. A depth camera is used for face recognition, and the recognized data are transferred to motion generation. We use imitation learning as the learning model. In motion generation, random actions are performed according to the user's initial facial expression data, and actions that elicit positive emotions from the user are learned through continuous imitation learning.
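
A highly simplified stand-in (not the paper's imitation-learning model) for the interaction loop described above: the character first acts randomly and then increasingly repeats the motions that were followed by a positive facial expression. The expression labels and motion names are assumptions.

```python
import random
from collections import defaultdict

MOTIONS = ["wave", "nod", "dance", "bow"]             # illustrative character motions

class MotionSelector:
    def __init__(self, explore=0.3):
        self.scores = defaultdict(float)               # (expression, motion) -> preference
        self.explore = explore

    def choose(self, expression):
        if random.random() < self.explore:             # keep exploring random motions
            return random.choice(MOTIONS)
        return max(MOTIONS, key=lambda m: self.scores[(expression, m)])

    def update(self, expression, motion, user_was_positive):
        self.scores[(expression, motion)] += 1.0 if user_was_positive else -0.5

if __name__ == "__main__":
    sel = MotionSelector()
    for _ in range(100):
        expr = random.choice(["neutral", "sad"])       # stand-in for depth-camera recognition
        m = sel.choose(expr)
        sel.update(expr, m, user_was_positive=(m == "dance"))  # toy user response
    print(sel.choose("neutral"))
```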

Vision-based Human-Robot Motion Transfer in Tangible Meeting Space (실감만남 공간에서의 비전 센서 기반의 사람-로봇간 운동 정보 전달에 관한 연구)

  • Choi, Yu-Kyung;Ra, Syun-Kwon;Kim, Soo-Whan;Kim, Chang-Hwan;Park, Sung-Kee
    • The Journal of Korea Robotics Society
    • /
    • v.2 no.2
    • /
    • pp.143-151
    • /
    • 2007
  • This paper deals with a tangible interface system that introduces a robot as a remote avatar. It is focused on a new method which makes a robot imitate human arm motions captured from a remote space. Our method is functionally divided into two parts: capturing human motion and adapting it to the robot. In the capturing part, we propose a modified potential function of metaballs for real-time performance and high accuracy. In the adapting part, we suggest a geometric scaling method for resolving the structural difference between a human and a robot. With our method, we have implemented a tangible interface and verified its speed and accuracy through tests.
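
A minimal sketch, under assumptions, of the metaball idea mentioned above: the body is modeled as a sum of smooth potentials around control points, and a 3-D point is treated as part of the body where the summed field exceeds a threshold. The Gaussian form, radii, and threshold are illustrative, not the paper's modified potential function.

```python
import numpy as np

def metaball_field(point, centers, radii):
    """Summed potential of all metaballs at a query point."""
    d2 = np.sum((centers - point) ** 2, axis=1)
    return np.sum(np.exp(-d2 / (2.0 * radii ** 2)))    # one Gaussian blob per center

def inside_body(point, centers, radii, threshold=0.5):
    return metaball_field(point, centers, radii) >= threshold

if __name__ == "__main__":
    # Two blobs standing in for an upper-arm and a forearm segment.
    centers = np.array([[0.0, 0.0, 1.2], [0.0, 0.25, 1.2]])
    radii = np.array([0.08, 0.07])
    print(inside_body(np.array([0.0, 0.1, 1.2]), centers, radii))   # near the arm -> True
    print(inside_body(np.array([0.5, 0.5, 0.0]), centers, radii))   # far away    -> False
```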


Creating Deep Learning-based Acrobatic Videos Using Imitation Videos

  • Choi, Jong In;Nam, Sang Hun
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.15 no.2
    • /
    • pp.713-728
    • /
    • 2021
  • This paper proposes an augmented reality technique to generate acrobatic scenes from hitting-motion videos. After a user shoots a motion that mimics hitting an object with the hands or feet, the pose is analyzed using deep learning-based motion tracking to follow the hand or foot while it hits the object. The hitting position and time are then extracted to generate the object's moving trajectory using physics optimization, and the trajectory is synchronized with the video. The proposed method can create videos of hitting objects with the feet (e.g., soccer-ball lifting) or fists (e.g., tap ball) and is suitable for augmented reality applications that include virtual objects.
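
A hedged sketch of the physics step implied above: given the positions and times of two consecutive hits recovered from the pose tracker, solve for the ballistic trajectory of the virtual object between them. Gravity-only flight and the 2-D setup are simplifying assumptions, not the paper's optimization.

```python
import numpy as np

G = np.array([0.0, -9.81])                     # gravity (m/s^2)

def trajectory_between_hits(p0, t0, p1, t1, n_samples=30):
    """Object positions between two hit events under projectile motion."""
    dt = t1 - t0
    v0 = (p1 - p0 - 0.5 * G * dt ** 2) / dt    # initial velocity that reaches p1 at t1
    ts = np.linspace(0.0, dt, n_samples)[:, None]
    return p0 + v0 * ts + 0.5 * G * ts ** 2    # one row per rendered frame

if __name__ == "__main__":
    # A foot hit at t = 0.0 s followed by the next hit at t = 0.6 s (positions in metres).
    path = trajectory_between_hits(np.array([0.0, 0.3]), 0.0,
                                   np.array([0.1, 0.4]), 0.6)
    print(path[0], path[-1], path[:, 1].max())  # starts/ends at the hits, peaks in between
```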

Human-like Whole Body Motion Generation of Humanoid Based on Simplified Human Model (단순인체모델 기반 휴머노이드의 인간형 전신동작 생성)

  • Kim, Chang-Hwan;Kim, Seung-Su;Ra, Syung-Kwon;You, Bum-Jae
    • The Journal of Korea Robotics Society
    • /
    • v.3 no.4
    • /
    • pp.287-299
    • /
    • 2008
  • People have expected a humanoid robot to move as naturally as a human being does. The natural movements of a humanoid robot may provide people with safer physical services and allow it to communicate with people through motion more effectively. This work presents a methodology to generate natural motions for a humanoid robot, converted from human motion capture data. The methodology produces not only kinematically mapped motions but also dynamically mapped ones. The kinematic mapping reflects the human-likeness in the converted motions, while the dynamic mapping ensures the movement stability of the humanoid robot's whole-body motions. The methodology consists of three processes: (a) human modeling, (b) kinematic mapping, and (c) dynamic mapping. The optimization-based human modeling gives the ZMP (Zero Moment Point) and COM (Center of Mass) time trajectories of an actor. Those trajectories are modified for the humanoid robot through the kinematic mapping. In addition to modifying the ZMP and COM trajectories, the lower-body (pelvis and legs) motion of the actor is scaled kinematically and converted to a motion available to the humanoid robot considering dynamic aspects. The KIST humanoid robot, Mahru, imitated a dancing motion to evaluate the methodology, showing good agreement with the captured motion.
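
A minimal sketch, assuming the common cart-table model, of how a ZMP trajectory can be obtained from a COM trajectory like the one the human model above provides: zmp = com - (z_com / g) * com_acceleration. The constant COM height and the finite-difference acceleration are assumptions.

```python
import numpy as np

def zmp_from_com(com_xy, z_com, dt, g=9.81):
    """com_xy: [T, 2] horizontal COM positions sampled every dt seconds."""
    acc = np.gradient(np.gradient(com_xy, dt, axis=0), dt, axis=0)  # finite-difference accel
    return com_xy - (z_com / g) * acc

if __name__ == "__main__":
    t = np.linspace(0.0, 2.0, 201)
    dt = t[1] - t[0]
    # Toy swaying COM path: slow forward drift plus lateral oscillation.
    com = np.stack([0.1 * t, 0.03 * np.sin(2 * np.pi * t)], axis=1)
    zmp = zmp_from_com(com, z_com=0.8, dt=dt)
    print(zmp[:3])
```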
