• Title/Summary/Keyword: Motion blending


Hybrid Motion Blending Algorithm of 3-Axis SCARA Robot based on LabVIEW® using Parametric Interpolation (매개변수를 이용한 LabVIEW® 기반의 3축 SCARA로봇의 이종모션 제어 알고리즘)

  • Chung, Won-Jee;Ju, Ji-Hun;Lee, Kee-Sang
    • Transactions of the Korean Society of Machine Tool Engineers / v.18 no.2 / pp.154-161 / 2009
  • In order to implement continuous-path motion on a robot, it is necessary to blend one joint motion into the next near a via point, using a trapezoidal joint-velocity profile. First, velocity superposition using parametric interpolation is proposed. Hybrid motion blending is defined as the blending of two different types of motion, such as a joint motion with a linear motion, in the neighborhood of a via point. Second, a hybrid motion blending algorithm is proposed based on velocity superposition using parametric interpolation. Using a 3-axis SCARA (Selective Compliance Assembly Robot Arm) robot with a LabVIEW® controller, the velocity superposition algorithm using parametric interpolation is shown to result in less vibration compared with PTP (Point-To-Point) motion and Kim's algorithm. Moreover, the hybrid motion algorithm is implemented on the robot using LabVIEW® programming, which is confirmed by showing the end-effector path of the joint-linear hybrid motion.
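
The velocity-superposition idea in this abstract can be illustrated with a short numeric sketch. The code below is only a minimal illustration, not the authors' LabVIEW® implementation: it assumes two straight segments meeting at a right-angle via point and invented trapezoidal profile values, starts the second segment's profile while the first is still decelerating, and sums the two velocity vectors so the tool rounds the corner instead of stopping.

```python
import numpy as np

def trapezoid(t, t_acc, t_flat, v_max):
    """Scalar trapezoidal speed profile; zero outside the move."""
    t = np.asarray(t, dtype=float)
    t_total = 2.0 * t_acc + t_flat
    v = np.zeros_like(t)
    accel = (t >= 0.0) & (t < t_acc)
    flat = (t >= t_acc) & (t < t_acc + t_flat)
    decel = (t >= t_acc + t_flat) & (t <= t_total)
    v[accel] = v_max * t[accel] / t_acc
    v[flat] = v_max
    v[decel] = v_max * (t_total - t[decel]) / t_acc
    return v

# Assumed geometry and profile values (illustrative only).
d1 = np.array([1.0, 0.0])              # direction of the first segment
d2 = np.array([0.0, 1.0])              # direction of the second segment
t_acc, t_flat, v_max = 0.2, 0.6, 0.1   # per-segment trapezoid [s], [s], [m/s]
t = np.linspace(0.0, 2.0, 2001)
dt = t[1] - t[0]

# Start the second profile as the first one begins to decelerate, then superpose.
t_start2 = t_acc + t_flat
v_blend = (trapezoid(t, t_acc, t_flat, v_max)[:, None] * d1
           + trapezoid(t - t_start2, t_acc, t_flat, v_max)[:, None] * d2)

# Integrate velocity to get the blended tool path (the corner is rounded near the via point).
path = np.cumsum(v_blend, axis=0) * dt
i_via = np.searchsorted(t, 2.0 * t_acc + t_flat)   # instant where a PTP move would stop
print("end point:", path[-1])
print("speed where PTP would stop:", np.linalg.norm(v_blend[i_via]))
```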

Implementation of LabVIEW®-based Joint-Linear Motion Blending on a Lab-manufactured 6-Axis Articulated Robot (RS2) (LabVIEW® 기반 6축 수직 다관절 로봇(RS2)의 이종 모션 블랜딩 연구)

  • Lee, D.S.;Chung, W.J.;Jang, J.H.;Kim, M.S.
    • Journal of the Korean Society of Manufacturing Technology Engineers / v.22 no.2 / pp.318-323 / 2013
  • For fast and accurate motion of a 6-axis articulated robot, a more advanced motion control strategy is needed. In general, the movement strategies of industrial robots can be divided into two kinds: PTP (Point-to-Point) and CP (Continuous Path). Recently, industrial robots that cooperate with machine tools are increasingly required to perform various jobs beyond simple handling or welding. Therefore, to cope with high-speed handling in cooperation with machine tools or other devices, CP should be implemented so as to reduce vibration and noise as well as operation time. This paper realizes CP (specifically joint-linear) motion blending in three-dimensional space for a lab-manufactured 6-axis articulated robot (called "RS2") by using LabVIEW® programming, based on parametric interpolation. Another small contribution of this paper is a motion-blending simulation technique based on RecurDyn® V7 and SolidWorks®, used to check whether the joint-linear blending motion generates stable robot motion in terms of the velocity magnitude at the end-effector. In order to evaluate the performance of joint-linear motion blending, simple PTP (i.e., linear-linear) motion is also physically implemented on RS2. The implementation results of joint-linear motion blending and PTP are compared in terms of vibration magnitude and travel time by using Zonic® Medallion vibration testing equipment. It is confirmed that the vibration peak of joint-linear motion blending is reduced to 1/10 of that of PTP.
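
As a rough companion to the joint-linear blending described in this abstract, the sketch below blends the tail of a joint-interpolated move into the head of a Cartesian straight-line move over a parametric overlap window. It is an assumption-laden illustration only: it uses a planar 2-link arm with made-up link lengths and poses rather than the 6-axis RS2, and a simple weight ramp rather than the paper's LabVIEW® implementation.

```python
import numpy as np

L1, L2 = 0.4, 0.3                       # assumed link lengths [m]

def fk(q):
    """Forward kinematics of a planar 2-link arm."""
    return np.array([L1 * np.cos(q[0]) + L2 * np.cos(q[0] + q[1]),
                     L1 * np.sin(q[0]) + L2 * np.sin(q[0] + q[1])])

def ik(p, elbow=1.0):
    """Analytic inverse kinematics (elbow-up branch by default)."""
    x, y = p
    c2 = (x * x + y * y - L1 * L1 - L2 * L2) / (2.0 * L1 * L2)
    q2 = elbow * np.arccos(np.clip(c2, -1.0, 1.0))
    q1 = np.arctan2(y, x) - np.arctan2(L2 * np.sin(q2), L1 + L2 * np.cos(q2))
    return np.array([q1, q2])

q_start = np.array([0.2, 0.8])          # segment A: joint-space move q_start -> q_via
q_via = np.array([0.9, 1.2])
p_via = fk(q_via)                       # segment B: Cartesian line p_via -> p_goal
p_goal = p_via + np.array([0.10, -0.15])

def hybrid_traj(n=400, overlap=0.25):
    """Joint-linear blending: segment B starts before segment A finishes."""
    qs = []
    for s in np.linspace(0.0, 2.0, n):                 # s in [0,1]: A, [1,2]: B
        s_a = min(s, 1.0)
        s_b = min(max(s - (1.0 - overlap), 0.0) / (1.0 + overlap), 1.0)
        q_a = (1.0 - s_a) * q_start + s_a * q_via      # joint interpolation
        q_b = ik((1.0 - s_b) * p_via + s_b * p_goal)   # linear (Cartesian) interpolation
        w = np.clip((s - (1.0 - overlap)) / (2.0 * overlap), 0.0, 1.0)
        qs.append((1.0 - w) * q_a + w * q_b)           # blend inside the overlap window
    return np.array(qs)

traj = hybrid_traj()
steps = np.linalg.norm(np.diff(np.array([fk(q) for q in traj]), axis=0), axis=1)
print("smallest end-effector step along the hybrid path (never zero):", steps.min())
```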

Linear Interpolation Transition of Character Animation for Immediate 3D Response to User Motion

  • Lim, Sooyeon
    • International Journal of Contents / v.11 no.1 / pp.15-20 / 2015
  • The purpose of this research is to study methods for performing transitions that visually represent the corresponding animations without bouncing as user information continues to be recognized, when interacting with a virtual 3D character in real time using user motion. When animation transitions are needed owing to a variety of external conditions, continuous recognition of user information is required so that the animation corresponds to the motion. The proposed method performs linear interpolation of the transition using cross-fade and blending techniques. The normalized playing time of the source animation is utilized for automatically calculating the transition interpolation length of the target animation, and also as the criterion for selecting between the cross-fade and blending techniques. In particular, in the case of blending, a weighting value based on the degree of similarity between the two animations is used as the blending parameter. Accordingly, visually smooth animation transitions are achieved on interactive holographic projection systems.
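
The transition mechanics described in this abstract can be sketched in a few lines. The code below is an illustrative reconstruction, not the paper's system: the clip data, the transition-length rule derived from the source clip's normalized playing time, and the similarity threshold are all assumptions.

```python
import numpy as np

def pose_distance(a, b):
    """Euclidean distance between two poses given as joint-angle vectors."""
    return float(np.linalg.norm(a - b))

def transition(source, target, t_norm, max_len=12, sim_threshold=0.5):
    """Bridge the source clip into the target clip.

    source, target : (frames, joints) arrays of joint angles
    t_norm         : normalized playing time (0..1) of the source clip when the
                     transition is requested; it sets the interpolation length
    """
    n = max(2, int(round(max_len * (1.0 - t_norm))))          # transition length in frames
    sim = 1.0 / (1.0 + pose_distance(source[-1], target[0]))  # similarity in (0, 1]
    frames = []
    for i in range(n):
        w = (i + 1) / n                                       # plain cross-fade weight
        if sim >= sim_threshold:
            w = w ** (1.0 / (sim + 1e-6))                     # blending: similarity shapes the ramp
        frames.append((1.0 - w) * source[-1] + w * target[min(i, len(target) - 1)])
    return np.vstack(frames)

# Toy clips with 3 joints; the source ends in a different pose than the target starts.
src = np.linspace([0.0, 0.0, 0.0], [0.4, -0.2, 0.1], 30)
tgt = np.linspace([0.6, 0.3, 0.0], [0.0, 0.0, 0.0], 30)
bridge = transition(src, tgt, t_norm=0.7)
print("transition frames:", len(bridge))
```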

Parametrized Construction of Virtual Drivers' Reach Motion to Seat Belt (매개변수로 제어가능한 운전자의 안전벨트 뻗침 모션 생성)

  • Seo, Hye-Won;Cordier, Frederic;Choi, Woo-Jin;Choi, Hyung-Yun
    • Korean Journal of Computational Design and Engineering / v.16 no.4 / pp.249-259 / 2011
  • In this paper we present our work on the parameterized construction of virtual drivers' reach motions to the seat belt, using motion capture data. A user can generate a new reach motion by controlling a number of parameters. We approach the problem by using multiple sets of example reach motions and learning the relation between the labeling parameters and the motion data. The work is composed of three tasks. First, we construct a motion database from multiple sets of labeled motion clips obtained with a motion capture device. This involves removing the redundancy of each motion clip by using PCA (Principal Component Analysis) and establishing temporal correspondence among different motion clips by automatic segmentation and piecewise time warping of each clip. Next, we compute motion blending functions by learning the relation between the labeling parameters (age, hip base point (HBP), and height) and the motion parameters, represented by a set of PC coefficients. During runtime, on-line motion synthesis is accomplished by evaluating the motion blending function from the user-supplied control parameters.
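
The pipeline in this abstract (PCA compression of time-aligned clips, a blending function learned from labeling parameters to PC coefficients, and runtime synthesis) can be sketched as follows. The example uses synthetic clips and a plain linear least-squares fit as stand-ins; it is not the authors' motion database or blending function.

```python
import numpy as np

rng = np.random.default_rng(0)
n_clips, n_frames, n_dof = 12, 50, 6
labels = np.column_stack([rng.uniform(20, 60, n_clips),      # age
                          rng.uniform(-0.1, 0.1, n_clips),   # hip base point (HBP) offset
                          rng.uniform(1.5, 1.9, n_clips)])   # height [m]

# Synthetic, already time-aligned clips that depend (noisily) on the labels.
motions = np.stack([np.outer(np.sin(np.linspace(0, np.pi, n_frames)),
                             0.01 * lab.sum() * np.ones(n_dof))
                    + 0.01 * rng.standard_normal((n_frames, n_dof))
                    for lab in labels])
X = motions.reshape(n_clips, -1)                  # one row per clip

# PCA: keep the leading principal components of the clip matrix.
mean = X.mean(axis=0)
U, S, Vt = np.linalg.svd(X - mean, full_matrices=False)
k = 3
coeffs = (X - mean) @ Vt[:k].T                    # PC coefficients per clip

# Blending function: affine least-squares map from labels to PC coefficients.
A = np.column_stack([labels, np.ones(n_clips)])
W, *_ = np.linalg.lstsq(A, coeffs, rcond=None)

def synthesize(age, hbp, height):
    """Runtime synthesis: labels -> PC coefficients -> reconstructed motion."""
    c = np.array([age, hbp, height, 1.0]) @ W
    return (mean + c @ Vt[:k]).reshape(n_frames, n_dof)

new_motion = synthesize(age=35.0, hbp=0.02, height=1.72)
print("synthesized motion shape:", new_motion.shape)
```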

Study on BLENDED CAM DESIGN (복합곡선으로 이루어진 캠의 설계에 관한 연구)

  • Yang, Min-Yang;Shon, Tae-Young
    • Journal of the Korean Society for Precision Engineering / v.12 no.9 / pp.59-65 / 1995
  • The cam is a main component in a variety of automatic machines and instruments. To meet the demands of complicated operation and reduced weight in automatic machines, curve blending, in which basic curves suited to the individual intervals are connected, is used in cam design. In curve blending, it is necessary to select an appropriate elementary curve for each interval and to confirm dynamic continuity at the connecting points between adjoining elementary curves. This paper presents a selection method for choosing an appropriate elementary curve for each interval and computes the follower displacement and angular displacement over each interval. The elementary curves used in synthesizing the blended curve are analyzed in detail, and the dynamic conditions are verified at every connecting point of the cam motion, so that the curve blending approach presented in the paper makes the design work easier.
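
A small numeric sketch of curve blending for a cam displacement diagram follows. The interval layout and elementary curves (cycloidal rise/return and dwells) are illustrative choices, not the paper's design tables; the point is the check of dynamic continuity at every connecting point between adjoining curves.

```python
import numpy as np

def cycloidal_rise(theta, beta, h):
    """Cycloidal rise to lift h over cam angle beta (continuous velocity and acceleration)."""
    x = theta / beta
    return h * (x - np.sin(2.0 * np.pi * x) / (2.0 * np.pi))

def dwell(theta, h):
    """Constant displacement during a dwell interval."""
    return np.full_like(theta, h, dtype=float)

# Interval layout (cam-angle spans in radians): rise - dwell - return - dwell.
h = 10.0                                                     # follower lift [mm]
segments = [(np.deg2rad(90.0),  lambda th: cycloidal_rise(th, np.deg2rad(90.0), h)),
            (np.deg2rad(60.0),  lambda th: dwell(th, h)),
            (np.deg2rad(120.0), lambda th: h - cycloidal_rise(th, np.deg2rad(120.0), h)),
            (np.deg2rad(90.0),  lambda th: dwell(th, 0.0))]

# Dynamic continuity at each connecting point: compare displacement and slope
# (follower velocity ds/dtheta) at the end of one curve and the start of the next.
eps = 1e-6
for i, ((beta1, f1), (_, f2)) in enumerate(zip(segments, segments[1:] + segments[:1]), start=1):
    s_end = f1(np.array([beta1]))[0]
    s_start = f2(np.array([0.0]))[0]
    v_end = (s_end - f1(np.array([beta1 - eps]))[0]) / eps
    v_start = (f2(np.array([eps]))[0] - s_start) / eps
    print(f"junction {i}: displacement jump {abs(s_end - s_start):.2e} mm, "
          f"velocity jump {abs(v_end - v_start):.2e} mm/rad")
```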


View synthesis with sparse light field for 6DoF immersive video

  • Kwak, Sangwoon;Yun, Joungil;Jeong, Jun-Young;Kim, Youngwook;Ihm, Insung;Cheong, Won-Sik;Seo, Jeongil
    • ETRI Journal / v.44 no.1 / pp.24-37 / 2022
  • Virtual view synthesis, which generates novel views with the characteristics of actually acquired images, is an essential technical component for delivering immersive video with realistic binocular disparity and smooth motion parallax. It is typically achieved in sequence by warping the given images to the designated viewing position, blending the warped images, and filling the remaining holes. For 6DoF use cases with large motion, patch-based warping is preferable to conventional methods that operate per pixel. In that case, the quality of the synthesized image depends strongly on how the warped images are blended. Accordingly, we propose a novel blending architecture that exploits the similarity of ray directions and the distribution of depth values. Results show that the proposed method synthesizes better views than the well-designed synthesizers used within MPEG immersive video (MPEG-I). Moreover, we describe a GPU-based implementation that synthesizes and renders views in real time, considering applicability to immersive video services.
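
The blending step described in this abstract can be sketched per pixel. The code below is a toy illustration with synthetic inputs, not the MPEG-I reference software or the authors' GPU implementation: each warped candidate is weighted by how closely its source ray direction matches the target ray and by how near its depth is to the closest candidate depth, and the candidates are then normalized and averaged.

```python
import numpy as np

def blend_candidates(colors, depths, ray_angles, sigma_angle=5.0, sigma_depth=0.05):
    """Blend warped candidates per pixel.

    colors     : (n_views, H, W, 3) warped colors
    depths     : (n_views, H, W)    warped depths (np.inf where a view has a hole)
    ray_angles : (n_views, H, W)    angle [deg] between source and target rays
    """
    z_near = depths.min(axis=0, keepdims=True)            # closest candidate depth per pixel
    w_angle = np.exp(-ray_angles / sigma_angle)           # prefer similar ray directions
    w_depth = np.exp(-(depths - z_near) / sigma_depth)    # prefer the nearest depth layer
    w = np.where(np.isfinite(depths), w_angle * w_depth, 0.0)
    w_sum = w.sum(axis=0, keepdims=True)
    w = np.where(w_sum > 0, w / np.maximum(w_sum, 1e-12), 0.0)
    blended = (w[..., None] * np.where(np.isfinite(depths)[..., None], colors, 0.0)).sum(axis=0)
    hole_mask = (w_sum[0] == 0)                           # pixels left for hole filling
    return blended, hole_mask

# Toy input: two warped views of a 4x4 patch.
rng = np.random.default_rng(1)
colors = rng.random((2, 4, 4, 3))
depths = np.stack([np.full((4, 4), 1.0), np.full((4, 4), 1.2)])
depths[1, 0, 0] = np.inf                                  # a disocclusion hole in view 2
ray_angles = np.stack([np.full((4, 4), 2.0), np.full((4, 4), 8.0)])
img, holes = blend_candidates(colors, depths, ray_angles)
print("blended patch shape:", img.shape, "| remaining holes:", int(holes.sum()))
```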

Biped Animation Blending By 3D Studio MAX Script (맥스 스크립트를 이용한 바이페드 애니메이션 합성)

  • Choe, Hong-Seok
    • Proceedings of the Korean Society for Emotion and Sensibility Conference / 2008.10a / pp.131-134 / 2008
  • Today, 3D character animation is easily encountered in most visual media, including live-action films, animation, games, and advertisements. A character's smooth movement is usually the result of motion capture or of key-frame work by a skilled animator. Such work requires expensive equipment or considerable manpower, and the finished result is difficult to modify or to apply additional effects to. This study synthesizes biped poses and animations by computing three-dimensional rotation values with 3D Studio MAX Script, and proposes a method for more realistic synthesis.
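
The core operation described in this abstract, synthesizing a biped pose by operating on per-bone 3D rotation values, can be sketched outside 3ds Max as well. The Python code below is only an analogue of the MAXScript approach: the bone names, rotations, and per-bone weights are invented for illustration, and quaternion slerp stands in for whatever rotation arithmetic the paper uses.

```python
import numpy as np

def slerp(q0, q1, w):
    """Spherical linear interpolation between two unit quaternions (x, y, z, w order)."""
    q0, q1 = q0 / np.linalg.norm(q0), q1 / np.linalg.norm(q1)
    dot = np.dot(q0, q1)
    if dot < 0.0:                       # take the shorter arc
        q1, dot = -q1, -dot
    if dot > 0.9995:                    # nearly parallel: fall back to normalized lerp
        q = (1.0 - w) * q0 + w * q1
        return q / np.linalg.norm(q)
    theta = np.arccos(np.clip(dot, -1.0, 1.0))
    return (np.sin((1.0 - w) * theta) * q0 + np.sin(w * theta) * q1) / np.sin(theta)

def axis_angle(axis, deg):
    """Unit quaternion for a rotation of `deg` degrees about `axis`."""
    axis = np.asarray(axis, float) / np.linalg.norm(axis)
    half = np.deg2rad(deg) / 2.0
    return np.concatenate([np.sin(half) * axis, [np.cos(half)]])

# Two source poses: per-bone rotations of a few (hypothetical) biped bones.
pose_walk = {"spine": axis_angle([0, 0, 1], 5.0), "l_arm": axis_angle([1, 0, 0], 40.0)}
pose_wave = {"spine": axis_angle([0, 0, 1], -3.0), "l_arm": axis_angle([1, 0, 0], 120.0)}

# Per-bone blend weights: keep most of the walking spine, take most of the waving arm.
weights = {"spine": 0.2, "l_arm": 0.8}
blended = {bone: slerp(pose_walk[bone], pose_wave[bone], weights[bone]) for bone in pose_walk}
for bone, q in blended.items():
    print(bone, np.round(q, 3))
```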
