• Title/Summary/Keyword: Hand and Finger Motion Generation

2 search results

Realistic Keyboard Typing Motion Generation Based on Physics Simulation (물리 시뮬레이션에 기반한 사실적인 키보드 타이핑 모션 생성)

  • Jang, Yongho; Eom, Haegwang; Noh, Junyong
    • Journal of the Korea Computer Graphics Society, v.21 no.5, pp.29-36, 2015
  • Human fingers are essential body parts that perform complex, detailed motion. Expressing natural finger motion is one of the most important problems in character animation research. Keyboard typing animation in particular is hard to create with the existing animation pipeline, because typing requires highly dexterous motion that moves many joints in a natural, coordinated way. In this paper, we propose a method for generating realistic keyboard typing motion based on physics simulation. To generate typing motion properly with physics-based simulation, the hand and keyboard models must be positioned within the allowed simulation space, and each keystroke must occur at the precise key location dictated by the input signal. Based on observation, we also incorporate the natural tendencies that accompany actual typing: for example, the hands and fingers always return to a default pose, and idle fingers tend to minimize their motion. We handle these constraints in a single solver to achieve natural keyboard typing simulation in real time. The results can be employed in various animation and virtual reality applications.
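The abstract's combination of soft objectives (default-pose attraction, minimal idle-finger motion) with hard key-position constraints can be illustrated with a toy per-frame solver. This is a minimal sketch under assumed quadratic objectives; the function name, the per-finger scalar state, and the weights are illustrative assumptions, not the paper's actual physics-based implementation.

```python
def typing_pose_step(q_prev, q_default, active_keys, w_default=1.0, w_idle=4.0):
    """One frame of a toy typing-pose solver (illustrative, not the paper's solver).

    Idle fingers take the closed-form minimum of
        w_default * (q - q_default)^2 + w_idle * (q - q_prev)^2,
    i.e. a blend that pulls toward the default pose while minimizing motion.
    Active fingers are pinned to their key targets as hard constraints.

    q_prev, q_default : per-finger scalar poses (e.g. flexion values)
    active_keys       : {finger_index: key_target} for fingers pressing a key
    """
    w = w_default + w_idle
    # soft objectives: weighted blend of previous pose and default pose
    q = [(w_default * qd + w_idle * qp) / w
         for qp, qd in zip(q_prev, q_default)]
    # hard constraints: active fingertips must land exactly on their keys
    for i, target in active_keys.items():
        q[i] = target
    return q
```

With `w_idle > w_default`, idle fingers stay close to where they were but drift gently toward the default pose, matching the observed tendencies; an actual physics simulation would express the same trade-offs as forces and constraints inside one solver.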

Deep Learning-Based Motion Reconstruction Using Tracker Sensors (트래커를 활용한 딥러닝 기반 실시간 전신 동작 복원)

  • Hyunseok Kim; Kyungwon Kang; Gangrae Park; Taesoo Kwon
    • Journal of the Korea Computer Graphics Society, v.29 no.5, pp.11-20, 2023
  • In this paper, we propose a novel deep learning-based motion reconstruction approach that generates full-body motions, including finger motions, while enabling online adjustment of the motion generation delay. The method combines Vive Trackers with deep learning to achieve more accurate motion reconstruction, and effectively mitigates foot skating through an Inverse Kinematics (IK) solver. A trained AutoEncoder reconstructs the character's body motion from tracker data in real time while offering the flexibility to adjust the generation delay as needed. To produce hand motions suited to the reconstructed body motion, we employ a Fully Connected Network (FCN); combining the AutoEncoder's body motion with the FCN's hand motions yields full-body character motions that include hand movements. To alleviate the foot skating common in motions generated by deep learning-based methods, we use an IK solver: by setting the trackers located near the character's feet as its end-effectors, the method precisely controls and corrects the character's foot movements, improving the overall accuracy of the generated motion. Through experiments, we validate the accuracy of motion generation in the proposed scheme, as well as the ability to adjust latency based on user input. We also assess the correction performance by comparing motions with and without the IK solver applied, focusing on how it addresses foot skating in the generated full-body motions.
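The foot-correction step described above, pinning an end-effector (the foot) to a tracker-derived target, can be sketched with a standard analytic two-bone IK solve in 2D. This is a minimal sketch of the general technique, not the paper's actual solver; the function name, the 2D simplification, and the single knee-bend choice are assumptions for illustration.

```python
import math

def two_bone_ik(root, target, l1, l2):
    """Analytic two-bone IK in 2D via the law of cosines (illustrative sketch).

    root   : (x, y) hip position
    target : (x, y) desired foot position (e.g. from a foot tracker)
    l1, l2 : upper- and lower-leg lengths
    Returns (knee, foot) positions; the foot reaches `target` when in range.
    """
    dx, dy = target[0] - root[0], target[1] - root[1]
    d = math.hypot(dx, dy)
    # clamp the target distance to the reachable annulus [|l1-l2|, l1+l2]
    d = min(max(d, abs(l1 - l2) + 1e-9), l1 + l2 - 1e-9)
    # interior angle at the root between the root->target line and the upper bone
    cos_a = (l1 * l1 + d * d - l2 * l2) / (2.0 * l1 * d)
    a = math.acos(max(-1.0, min(1.0, cos_a)))
    base = math.atan2(dy, dx)
    hip = base + a  # choose one of the two mirror knee-bend solutions
    knee = (root[0] + l1 * math.cos(hip), root[1] + l1 * math.sin(hip))
    lower = math.atan2(target[1] - knee[1], target[0] - knee[0])
    foot = (knee[0] + l2 * math.cos(lower), knee[1] + l2 * math.sin(lower))
    return knee, foot
```

Holding the foot target fixed while the foot is planted is what suppresses skating: however the network's raw output drifts, the IK solve re-places the leg joints so the end-effector stays on its tracker-derived contact point.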