• Title/Abstract/Keyword: Motion Imitation

Search results: 16 (processing time: 0.03 s)

진화 알고리즘을 사용한 인간형 로봇의 동작 모방 학습 및 실시간 동작 생성 (Motion Imitation Learning and Real-time Movement Generation of Humanoid Using Evolutionary Algorithm)

  • 박가람;나성권;김창환;송재복
    • 제어로봇시스템학회논문지 / Vol. 14, No. 10 / pp.1038-1046 / 2008
  • This paper presents a framework to generate human-like movements of a humanoid in real time using a movement primitive database of a human. The framework consists of two processes: 1) offline motion imitation learning based on an Evolutionary Algorithm and 2) online motion generation of a humanoid using the database updated by the motion imitation learning. For the offline process, the initial database contains the kinetic characteristics of a human, since it is populated with captured human motions. The database then develops through the proposed framework of motion learning based on an Evolutionary Algorithm, acquiring the kinetic characteristics of a humanoid in terms of minimal torque or joint jerk. The humanoid generates human-like movements for a given purpose in real time by linearly interpolating the primitive motions in the developed database. The movement of catching a ball was examined in simulation.
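The abstract above describes real-time generation by linearly interpolating primitive motions stored in a database. A minimal sketch of that blending step follows; the primitive arrays, their shapes, and the blending weight are illustrative assumptions, not the paper's data format.

```python
import numpy as np

def blend_primitives(primitive_a, primitive_b, w):
    """Linearly interpolate two joint-angle trajectories of equal length.

    primitive_a, primitive_b: arrays of shape (T, n_joints), one row per
    time step; w in [0, 1] is the blending weight toward primitive_b.
    """
    return (1.0 - w) * primitive_a + w * primitive_b

# Hypothetical example: two stored "reach" primitives from the database,
# blended to produce a motion between their respective end positions.
T, n_joints = 100, 7
reach_low = np.random.rand(T, n_joints)    # placeholder for a stored primitive
reach_high = np.random.rand(T, n_joints)   # placeholder for a stored primitive
blended = blend_primitives(reach_low, reach_high, w=0.3)
```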

Work chain-based inverse kinematics of robot to imitate human motion with Kinect

  • Zhang, Ming;Chen, Jianxin;Wei, Xin;Zhang, Dezhou
    • ETRI Journal / Vol. 40, No. 4 / pp.511-521 / 2018
  • The ability to realize human-motion imitation using robots is closely related to developments in the field of artificial intelligence. However, it is not easy to imitate human motions entirely, owing to the physical differences between the human body and robots. In this paper, we propose a work chain-based inverse kinematics method to enable a robot to imitate the human motion of the upper limbs in real time. Two work chains are built on each arm to ensure motion similarity in terms of the end-effector trajectory and the joint-angle configuration. In addition, a two-phase filter is used to remove interference and noise, together with a self-collision avoidance scheme to maintain the stability of the robot during the imitation. Experimental results verify the effectiveness of our solution on the humanoid robot Nao-H25 in terms of accuracy and real-time performance.
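The work-chain formulation itself is specific to the paper, but the per-chain inverse-kinematics solve it relies on can be illustrated with a generic damped least-squares update. `forward_kinematics` here is a hypothetical user-supplied function, and the damping value is an illustrative choice.

```python
import numpy as np

def dls_ik_step(q, target, forward_kinematics, damping=0.05, eps=1e-5):
    """One damped least-squares IK update for a single kinematic chain.

    q: current joint angles (n,); target: desired end-effector position (3,);
    forward_kinematics: user-supplied function mapping q -> position (3,).
    """
    pos = forward_kinematics(q)
    err = target - pos
    # Numerical Jacobian of the end-effector position w.r.t. joint angles.
    J = np.zeros((3, q.size))
    for i in range(q.size):
        dq = np.zeros_like(q)
        dq[i] = eps
        J[:, i] = (forward_kinematics(q + dq) - pos) / eps
    # Damped pseudo-inverse keeps the update stable near singularities.
    JT = J.T
    dq = JT @ np.linalg.solve(J @ JT + (damping ** 2) * np.eye(3), err)
    return q + dq
```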

Adaptation of Motion Capture Data of Human Arms to a Humanoid Robot Using Optimization

  • Kim, Chang-Hwan;Kim, Do-Ik
    • 제어로봇시스템학회:학술대회논문집 / 제어로봇시스템학회 2005 ICCAS / pp.2126-2131 / 2005
  • Interactions of a humanoid with a human are important when the humanoid is requested to provide people with human-friendly services in unknown or uncertain environments. Such interactions may require more complicated and human-like behaviors from the humanoid. In this work the arm motions of a human are discussed as the early stage of human motion imitation by a humanoid. A motion capture system is used to obtain human-friendly arm motions as references. However, the captured motions may not be applied directly to the humanoid, since differences in geometric and dynamic aspects such as length, mass, degrees of freedom, and kinematic and dynamic capabilities exist between the humanoid and the human. To overcome this difficulty, a method to adapt captured motions to a humanoid is developed. The geometric difference in arm length is resolved by scaling the arm length of the humanoid with a constant. Using the scaled geometry of the humanoid, the imitation of the actor's arm motions is achieved by solving an inverse kinematics problem formulated using optimization. The errors between the captured trajectories of the actor's arms and the approximated trajectories of the humanoid's arms are minimized. Dynamic capabilities of the joint motors, such as limits on joint position, velocity, and acceleration, are also imposed on the optimization problem. Two motions, waving one hand and performing a statement in sign language, are imitated by a humanoid through dynamics simulation.
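The abstract formulates retargeting as an optimization that minimizes the error between scaled captured trajectories and the humanoid's trajectories under joint limits. Below is a per-frame sketch of that idea, assuming a hypothetical forward-kinematics function `fk` and using a generic bounded optimizer rather than the authors' solver.

```python
import numpy as np
from scipy.optimize import minimize

def retarget_frame(captured_wrist, scale, q_init, fk, joint_limits):
    """Fit humanoid joint angles so its wrist tracks the scaled captured wrist.

    captured_wrist: actor wrist position (3,); scale: humanoid/actor arm-length
    ratio; fk: hypothetical forward-kinematics function q -> wrist position (3,);
    joint_limits: list of (low, high) bounds, one pair per joint.
    """
    target = scale * np.asarray(captured_wrist)

    def cost(q):
        # Squared tracking error between the humanoid wrist and the scaled target.
        return np.sum((fk(q) - target) ** 2)

    res = minimize(cost, q_init, bounds=joint_limits, method="L-BFGS-B")
    return res.x
```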


근골격 모델과 참조 모션을 이용한 이족보행 강화학습 (Reinforcement Learning of Bipedal Walking with Musculoskeletal Models and Reference Motions)

  • 전지웅;권태수
    • 한국컴퓨터그래픽스학회논문지 / Vol. 29, No. 1 / pp.23-29 / 2023
  • This paper introduces a method to obtain high-quality musculoskeletal character simulation at low cost through reinforcement learning, based on reference motion data obtained by motion-capturing bipedal walking. We retarget the reference motion data so that the character model can perform it, and then train the model through reinforcement learning to reproduce the motion. By combining reference-motion imitation with minimization of the muscles' metabolic energy, the musculoskeletal model learns to walk bipedally in a desired direction. With this approach, the musculoskeletal model can be trained at lower cost than conventional manually designed controllers and achieves high-quality bipedal walking.
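The reward described above combines reference-motion imitation with a minimal metabolic-energy term for the muscles. A sketch of such a combined reward follows, with illustrative weights and an activation-squared proxy for metabolic cost; this is not the paper's exact formulation.

```python
import numpy as np

def walking_reward(q, q_ref, muscle_activations, w_imitate=0.7, w_energy=0.3):
    """Reward combining reference-motion imitation and low muscle effort.

    q, q_ref: current and reference joint angles (n,);
    muscle_activations: current muscle activation levels in [0, 1] (m,).
    The exponential imitation term and the quadratic effort proxy are
    illustrative choices.
    """
    imitation = np.exp(-2.0 * np.sum((q - q_ref) ** 2))
    effort = np.mean(muscle_activations ** 2)   # crude metabolic-energy proxy
    return w_imitate * imitation - w_energy * effort
```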

긍정감정을 유도하기 위한 모방학습을 이용한 상호작용 시스템 프로토타입 개발 (Development of An Interactive System Prototype Using Imitation Learning to Induce Positive Emotion)

  • 오찬해;강창구
    • 한국정보전자통신기술학회논문지 / Vol. 14, No. 4 / pp.239-246 / 2021
  • In the fields of computer graphics and HCI, there has been much research on systems that create characters and interact with them naturally. These studies have focused on responses to the user's actions, while designing character behaviors that elicit positive emotions from the user remains a challenging problem. In this paper, we develop a prototype interactive system that uses artificial intelligence to elicit positive emotions from the user through the movements of a virtual character. The proposed system is divided into facial expression recognition and motion generation for the virtual character. A depth camera is used for facial expression recognition, and the recognized expression data of the user are passed to the motion generation module. We use imitation learning as the learning model to develop a personalized interactive system. In motion generation, the system initially performs random actions according to the user's expression data and, through continued imitation learning, learns actions that can elicit positive emotions from the user.
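The system starts with random character motions and gradually learns which motions elicit positive expressions from a specific user. Below is a toy sketch of that personalization loop, framed as an epsilon-greedy value table over (expression, motion) pairs; this framing and all names are illustrative assumptions, not the authors' model.

```python
import random
from collections import defaultdict

class ExpressionDrivenPolicy:
    """Toy policy: pick character motions given the user's expression and
    reinforce motions that were followed by a positive expression."""

    def __init__(self, motions, epsilon=0.2):
        self.motions = motions
        self.epsilon = epsilon
        self.value = defaultdict(float)   # (expression, motion) -> score
        self.count = defaultdict(int)

    def select(self, expression):
        if random.random() < self.epsilon:        # explore: random motion
            return random.choice(self.motions)
        return max(self.motions, key=lambda m: self.value[(expression, m)])

    def update(self, expression, motion, positive):
        # Incremental average of observed "positive expression" feedback.
        key = (expression, motion)
        self.count[key] += 1
        reward = 1.0 if positive else 0.0
        self.value[key] += (reward - self.value[key]) / self.count[key]
```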

실감만남 공간에서의 비전 센서 기반의 사람-로봇간 운동 정보 전달에 관한 연구 (Vision-based Human-Robot Motion Transfer in Tangible Meeting Space)

  • 최유경;나성권;김수환;김창환;박성기
    • 로봇학회논문지 / Vol. 2, No. 2 / pp.143-151 / 2007
  • This paper deals with a tangible interface system that introduces a robot as a remote avatar. It is focused on a new method that makes a robot imitate human arm motions captured from a remote space. Our method is functionally divided into two parts: capturing human motion and adapting it to the robot. In the capturing part, we propose a modified potential function of metaballs for real-time performance and high accuracy. In the adapting part, we suggest a geometric scaling method for resolving the structural difference between a human and a robot. Using this method, we implemented a tangible interface and evaluated its speed and accuracy.
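The capturing part relies on a modified metaball potential function; the modification is not detailed in the abstract, but the standard metaball field it builds on can be sketched as follows. The centers, radii, and threshold are illustrative values, not the paper's body model.

```python
import numpy as np

def metaball_field(p, centers, radii):
    """Standard metaball field value at point p from spherical sources.

    p: query point (3,); centers: (k, 3) source centers; radii: (k,) influence
    radii. A point counts as 'inside' the blended surface where the summed
    field exceeds a chosen threshold.
    """
    d2 = np.sum((centers - p) ** 2, axis=1)
    r2 = radii ** 2
    return np.sum(r2 / (d2 + 1e-9))     # simple inverse-square falloff

# Usage: classify a depth-sensor point against two arm-segment metaballs.
centers = np.array([[0.0, 0.0, 1.0], [0.2, 0.0, 1.0]])
radii = np.array([0.08, 0.08])
inside = metaball_field(np.array([0.1, 0.0, 1.0]), centers, radii) > 1.0
```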


Creating Deep Learning-based Acrobatic Videos Using Imitation Videos

  • Choi, Jong In;Nam, Sang Hun
    • KSII Transactions on Internet and Information Systems (TIIS) / Vol. 15, No. 2 / pp.713-728 / 2021
  • This paper proposes an augmented reality technique to generate acrobatic scenes from hitting-motion videos. After a user shoots a motion that mimics hitting an object with the hands or feet, their pose is analyzed using deep learning-based motion tracking to follow the hand or foot movement while hitting the object. The hitting position and time are then extracted to generate the object's moving trajectory using physics optimization, and the trajectory is synchronized with the video. The proposed method can create videos of hitting objects with the feet (e.g., soccer-ball lifting) or fists (e.g., tap ball) and is suitable for augmented reality applications that include virtual objects.
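The method extracts hitting positions and times and then generates the object's trajectory with physics optimization. Below is a minimal sketch under the assumption of a gravity-only ballistic model, fitting an initial position and velocity so the object passes through the detected contacts; the paper's actual optimization may include more physics.

```python
import numpy as np

def fit_ballistic(hit_times, hit_positions, g=9.81):
    """Least-squares fit of initial position/velocity for a projectile that
    should pass through the detected hit positions at the detected hit times.

    hit_times: (k,) contact times; hit_positions: (k, 3) contact points.
    Returns (p0, v0) with p(t) = p0 + v0*t + 0.5*a*t^2, a = (0, -g, 0).
    """
    t = np.asarray(hit_times).reshape(-1, 1)
    a = np.array([0.0, -g, 0.0])
    # Remove the known gravity term, then solve [1, t] @ [p0; v0] = residual.
    rhs = np.asarray(hit_positions) - 0.5 * (t ** 2) * a
    A = np.hstack([np.ones_like(t), t])          # (k, 2) design matrix
    sol, *_ = np.linalg.lstsq(A, rhs, rcond=None)
    return sol[0], sol[1]                        # p0 (3,), v0 (3,)
```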

단순인체모델 기반 휴머노이드의 인간형 전신동작 생성 (Human-like Whole Body Motion Generation of Humanoid Based on Simplified Human Model)

  • 김창환;김승수;나성권;유범재
    • 로봇학회논문지 / Vol. 3, No. 4 / pp.287-299 / 2008
  • People expect a humanoid robot to move as naturally as a human being does. Natural movements may allow a humanoid robot to provide people with safer physical services and to communicate with them through motion more effectively. This work presents a methodology to generate natural motions for a humanoid robot, converted from human motion capture data. The methodology produces not only kinematically mapped motions but also dynamically mapped ones. The kinematic mapping reflects the human-likeness in the converted motions, while the dynamic mapping ensures the movement stability of whole-body motions of a humanoid robot. The methodology consists of three processes: (a) human modeling, (b) kinematic mapping, and (c) dynamic mapping. The human modeling, based on optimization, gives the ZMP (Zero Moment Point) and COM (Center of Mass) time trajectories of an actor. Those trajectories are modified for a humanoid robot through the kinematic mapping. In addition to modifying the ZMP and COM trajectories, the lower-body (pelvis and legs) motion of the actor is scaled kinematically and converted to a motion available to the humanoid robot, considering dynamical aspects. The KIST humanoid robot, Mahru, imitated a dancing motion to evaluate the methodology, showing good agreement with the original motion.
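The human-modeling step yields ZMP and COM time trajectories. Below is a sketch of the standard cart-table approximation that relates the two; this is a textbook relation, not necessarily the authors' exact human model.

```python
import numpy as np

def zmp_from_com(com, com_acc, g=9.81):
    """Approximate ZMP from a COM trajectory using the cart-table model.

    com: (T, 3) COM positions; com_acc: (T, 3) COM accelerations.
    x_zmp = x - z * x_acc / (z_acc + g), and likewise for y.
    """
    x, y, z = com[:, 0], com[:, 1], com[:, 2]
    ax, ay, az = com_acc[:, 0], com_acc[:, 1], com_acc[:, 2]
    denom = az + g
    zmp_x = x - z * ax / denom
    zmp_y = y - z * ay / denom
    return np.stack([zmp_x, zmp_y], axis=1)   # (T, 2) ground-plane ZMP
```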
