• Title/Summary/Keyword: directional interpolation


A COG Variable Analysis of Air-rolling-breakfall in Judo (유도 공중회전낙법의 COG변인 분석)

  • Kim, Eui-Hwan;Chung, Chae-Wook;Kim, Sung-Sup
    • Korean Journal of Applied Biomechanics
    • /
    • v.15 no.3
    • /
    • pp.117-132
    • /
    • 2005
  • This was a follow-up to "A Kinematic Analysis of Air-rolling-breakfall in Judo". The purpose of this study was to analyze Center of Gravity (COG) variables when performing the Air-rolling-breakfall motion while passing forward over (PFO) vertical hurdles (2m height, take-off board 1m height) in judo. Subjects were four males of the Y. University squad, trainees of the demonstration exhibition team and national-level judoists, who were filmed by four S-VHS 16mm video cameras (60 fields/sec.) and analyzed with three-dimensional film analysis methods. The COG variables were the anterior-posterior directional COG and its linear velocity, and the vertical directional COG and its linear velocity. The data were digitized with the KWON3D program, standardized using cubic spline interpolation, and the mean and standard deviation were calculated for each variable. From the data analysis and discussion, the conclusions on performing the Air-rolling-breakfall were as follows: 1. The forward range of the anterior-posterior directional COG (APD-COG) was $0.31{\sim}0.41m$ in the take-off position (event 1), $1.20{\sim}1.33m$ in the air-top position (event 2), $2.12{\sim}2.30m$ in the touch-down position (event 3), and $2.14{\sim}2.32m$ in the safety breakfall position (event 4), respectively. 2. The linear velocity of the APD-COG was $1.03{\sim}2.14m/sec$. in the take-off position (event 1), $1.97{\sim}2.22m/sec$. in the air-top position (event 2), $1.05{\sim}1.32m/sec$. in the touch-down position (event 3), and, decreasing gradually, $0.91{\sim}1.23m/sec$. in the safety breakfall position (event 4), respectively. 3. The range of the vertical directional COG (VD-COG) upward from the mat was $1.35{\sim}1.46m$ in the take-off position (event 1), reached its highest value of $2.07{\sim}2.23m$ in the air-top position (event 2), decreased rapidly to $0.3{\sim}0.58m$ in the touch-down position (event 3), and decreased gradually to $0.22{\sim}0.50m$ in the safety breakfall position (event 4). 4. The linear velocity of the VD-COG was $1.60{\sim}1.87m/sec$. in the take-off position (event 1), $0.03{\sim}0.08m/sec$. in the air-top position (event 2), $-4.37{\sim}-4.76m/sec$. in the touch-down position (event 3), and $-4.40{\sim}-4.77m/sec$. in the safety breakfall position (event 4). The Air-rolling-breakfall showed parabolic movement from the take-off position to the air-top position, followed by vertical falling movement from the air-top position to the safety breakfall. In conclusion, Ukemi (breakfall) is a safe falling method. Therefore, to perform a safe fall that decreases and minimizes the shock and impact of the Air-rolling-breakfall, angular momentum must be maximized from the take-off board action to the air-top position, and then minimized in the touch-down and safety breakfall positions.
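The standardization step mentioned in the abstract, resampling each digitized COG trajectory onto a common time base with cubic splines before averaging, can be sketched as follows (a minimal illustration using SciPy's `CubicSpline`; the trial lengths and values below are made up, not the paper's data):

```python
import numpy as np
from scipy.interpolate import CubicSpline

def standardize_trial(t, cog, n_points=101):
    """Resample one COG trajectory onto a common 0-100% time base."""
    t_norm = (t - t[0]) / (t[-1] - t[0])          # normalize time to [0, 1]
    spline = CubicSpline(t_norm, cog)             # fit a cubic spline
    return spline(np.linspace(0.0, 1.0, n_points))

# Illustrative trials digitized at different lengths (not real data).
t1, y1 = np.linspace(0, 1.2, 72), np.linspace(0.3, 2.3, 72)
t2, y2 = np.linspace(0, 1.5, 90), np.linspace(0.4, 2.2, 90)

resampled = np.vstack([standardize_trial(t1, y1),
                       standardize_trial(t2, y2)])
mean_curve = resampled.mean(axis=0)               # mean across trials
sd_curve = resampled.std(axis=0, ddof=1)          # SD across trials
```

Once all trials share the same 101-point base, mean and standard deviation curves can be computed point by point, which is what the abstract's "mean values and standard deviation for each variable" requires.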

Painterly rendering using density of edges (에지 밀도 정보를 이용한 회화적 렌더링)

  • Lee, Ho-Chang;Park, Young-Sup;Seo, Sang-Hyun;Yoon, Kyung-Hyn
    • Journal of the Korea Computer Graphics Society
    • /
    • v.12 no.4
    • /
    • pp.7-15
    • /
    • 2006
  • The ultimate objective of painterly rendering is to express an input image as if it were hand-drawn. The factors that express this painterly effect are the thickness of the brush, its direction and texture, and the criteria for judging whether a generated brush stroke will be drawn onto the canvas. In this paper, an algorithm that uses the density of edges as the criterion for whether a stroke is drawn onto the canvas is proposed. Edge density refers to the quantity of edges in a specific area. The method finds the locations of strokes to be drawn on a dynamic grid, and expresses consistent stroke direction through direction interpolation; texture is expressed using various textured brushes. By considering edge density, detailed areas and abstract areas can be rendered differently, resulting in a more human-like oil-painting effect.
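The edge-density criterion described above, the fraction of edge pixels within each cell of a grid over the image, can be sketched roughly as follows (an illustrative NumPy version; the block size, gradient operator, and threshold are assumptions, not the authors' parameters):

```python
import numpy as np

def edge_density(gray, block=8, thresh=0.2):
    """Fraction of edge pixels in each block x block cell of a grayscale image."""
    gy, gx = np.gradient(gray.astype(float))      # simple image gradients
    mag = np.hypot(gx, gy)                        # gradient magnitude
    edges = (mag > thresh).astype(float)          # binary edge map
    h, w = edges.shape
    h, w = h - h % block, w - w % block           # crop to a block multiple
    cells = edges[:h, :w].reshape(h // block, block, w // block, block)
    return cells.mean(axis=(1, 3))                # density in [0, 1] per cell

# Flat image with one sharp vertical step: edges concentrate along the step.
img = np.zeros((16, 16))
img[:, 8:] = 1.0
dens = edge_density(img, block=8)
```

Cells with high density would receive small, detailed strokes, while low-density cells can be covered with larger, more abstract strokes.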


An Analysis on the Change Pattern of Spatio-Temporal Land Price in Gongju City Using the Geostatistical Methods (공간통계를 이용한 공주시의 시공간적 지가변화패턴 분석)

  • Kim, Jung-Hee
    • Journal of Korean Society for Geospatial Information Science
    • /
    • v.20 no.1
    • /
    • pp.93-99
    • /
    • 2012
  • This study aims to identify the spatio-temporal land price change pattern in Gongju city, including the incorporated area and the surrounding area affected by the Multifunctional Administrative City construction. For this, GIS data were built by calculating the average land price of each of 209 Dong and Ri units for the years 2000, 2005 and 2010. First, the change in land price at 5-year intervals was identified through kriging interpolation, a geostatistical technique. Second, a trend analysis was conducted to identify the directional change pattern along the east-west and north-south axes. Finally, the weighted mean center, using land price as the weight, was calculated to examine the moving direction of the center point of land price over time. The results show that the land price change pattern exhibited visibly higher growth in the eastern area where the Multifunctional Administrative City was built, and the movement of the center point of land price shows that this growth was concentrated in the northeastern area.
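The weighted mean center used in the final step, the average of the zone coordinates weighted by land price, can be sketched as follows (the coordinates and prices below are illustrative, not the study's data):

```python
import numpy as np

def weighted_mean_center(x, y, w):
    """Weighted mean center of points (x, y) with weights w (e.g. land price)."""
    w = np.asarray(w, dtype=float)
    return (np.average(x, weights=w), np.average(y, weights=w))

# Hypothetical zone centroids and average land prices for two years:
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([0.0, 0.0, 1.0, 1.0])
price_2000 = np.array([1.0, 1.0, 1.0, 1.0])   # uniform prices
price_2010 = np.array([1.0, 1.0, 3.0, 5.0])   # stronger growth in the east

c2000 = weighted_mean_center(x, y, price_2000)
c2010 = weighted_mean_center(x, y, price_2010)
```

Comparing the centers across years (here `c2010` lies east and north of `c2000`) gives the moving direction of the center point of land price that the study reports.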

Directional Deinterlacing Method Using Local Gradient Features (국부 Gradient 특징을 이용한 방향성 deinterlacing 방법)

  • Woo, Dong-Hun;Eom, Il-Kyu;Kim, Yoo-Shin
    • Journal of the Institute of Electronics Engineers of Korea SP
    • /
    • v.42 no.5 s.305
    • /
    • pp.41-46
    • /
    • 2005
  • Deinterlacing is the conversion of an interlaced image to a progressive scan image, which can be considered a 2x vertical image interpolation. In this paper, a simple and effective deinterlacing method is proposed based on the local gradient information of neighboring pixels. In the proposed method, weights for the directions around the pixel to be interpolated are estimated, and the weighted sum of the neighboring pixels becomes the final intensity value of the interpolated pixel. The proposed method has a structure suitable for practical implementation and avoids the artifacts caused by wrong detection of the edge direction. In simulation, it showed better subjective and objective performance than the ELA method, and performance comparable to a variation of ELA that has a more complex structure and requires a couple of parameters determined by experience.
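For context, the ELA baseline that the proposed method is compared against can be sketched as follows. This is a simplified illustration of classic edge-based line averaging, which picks the direction (135, 90, or 45 degrees) whose endpoints differ least, not the authors' gradient-weighted scheme:

```python
import numpy as np

def ela_interpolate(above, below):
    """Interpolate one missing scanline from the lines above and below (ELA).

    For each pixel, three candidate directions are scored by the absolute
    difference of their endpoints; the direction with the smallest
    difference is assumed to follow an edge and is averaged.
    """
    n = len(above)
    out = np.empty(n)
    for i in range(n):
        l, r = max(i - 1, 0), min(i + 1, n - 1)   # clamp at the borders
        cands = [(abs(above[l] - below[r]), (above[l] + below[r]) / 2.0),
                 (abs(above[i] - below[i]), (above[i] + below[i]) / 2.0),
                 (abs(above[r] - below[l]), (above[r] + below[l]) / 2.0)]
        out[i] = min(cands)[1]                    # pick the best direction
    return out

# A diagonal edge: ELA follows the edge instead of blurring across it.
above = np.array([0.0, 1.0, 1.0, 1.0, 1.0])
below = np.array([0.0, 0.0, 0.0, 1.0, 1.0])
line = ela_interpolate(above, below)
```

Plain line averaging would produce 0.5 at the edge, while the directional choice keeps the edge sharp; a hard directional choice, however, is exactly what causes artifacts when the direction is mis-detected, which is what the paper's weighted sum mitigates.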

Spatio-Temporal Video De-interlacing Algorithm Based on MAP Estimation (MAP 예측기 기반의 시공간 동영상 순차주사화 알고리즘)

  • Lee, Ho-Taek;Song, Byung-Cheol
    • Journal of the Institute of Electronics Engineers of Korea SP
    • /
    • v.49 no.2
    • /
    • pp.69-75
    • /
    • 2012
  • This paper presents a novel de-interlacing algorithm that can compensate for motion compensation errors by using a maximum a posteriori (MAP) estimator. First, a proper registration is performed between the current field and its adjacent fields, and the progressive frame corresponding to the current field is found via the MAP estimator based on the computed registration information. Here, in order to obtain a stable solution, the well-known bilateral total variation (BTV)-based regularization is employed. Next, so-called feathering artifacts are detected effectively on a block basis, and edge-directional interpolation is applied to the pixels where feathering artifacts may occur, instead of the above-mentioned temporal de-interlacing. Experimental results show that the PSNR of the proposed algorithm is on average 4dB higher than that of previous studies, and that it provides better subjective quality than previous works.
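The block-based feathering detection mentioned above can be illustrated with a toy measure: combing makes adjacent-line differences large while same-parity-line differences stay small. This is a hypothetical score for illustration, not the paper's detector; the block contents and threshold are assumptions:

```python
import numpy as np

def feathering_score(block):
    """Score combing ("feathering") in a block of a reconstructed frame.

    Feathering shows up as large differences between adjacent lines that
    vanish between same-parity lines, so the ratio below stays small for
    genuine vertical detail and grows large for combed blocks.
    """
    inter = np.abs(np.diff(block, axis=0)).mean()            # line i vs i+1
    intra = np.abs(block[2:] - block[:-2]).mean() + 1e-6     # line i vs i+2
    return inter / intra

clean = np.tile(np.linspace(0.0, 0.2, 8)[:, None], (1, 8))  # smooth gradient
combed = clean.copy()
combed[1::2] += 0.5                                          # shifted odd field

# Blocks are flagged when the score exceeds an empirical threshold.
flag_clean = feathering_score(clean) > 4.0
flag_combed = feathering_score(combed) > 4.0
```

Flagged blocks would then fall back to edge-directional interpolation instead of the temporally de-interlaced result, as the abstract describes.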

A Study on the VLSI Design of Efficient Color Interpolation Technique Using Spatial Correlation for CCD/CMOS Image Sensor (화소 간 상관관계를 이용한 CCD/CMOS 이미지 센서용 색 보간 기법 및 VLSI 설계에 관한 연구)

  • Lee, Won-Jae;Lee, Seong-Joo;Kim, Jae-Seok
    • Journal of the Institute of Electronics Engineers of Korea SD
    • /
    • v.43 no.11 s.353
    • /
    • pp.26-36
    • /
    • 2006
  • In this paper, we propose a cost-effective color filter array (CFA) demosaicing method for digital still cameras in which a single CCD or CMOS image sensor is used. Since a CFA is adopted, the missing color values in the red, green and blue channels must be interpolated at each pixel location. While most state-of-the-art algorithms invest a great deal of computational effort in enhancing the reconstructed image to overcome color artifacts, we focus on eliminating the color artifacts with low computational complexity. Using the spatial correlation of adjacent pixels, the edge-directional information of the neighboring pixels is used to determine the edge direction of the current pixel. We apply our method to state-of-the-art algorithms that use edge-directed methods to interpolate the missing color channels. The experimental results show that the proposed method enhances the demosaiced image quality by $0.09{\sim}0.47dB$ in PSNR, depending on the base algorithm, by removing most of the color artifacts. The proposed method was implemented and verified successfully using Verilog HDL and an FPGA. It was synthesized to gate-level circuits using a 0.25um CMOS standard cell library. The total logic gate count is 12K, and five line memories are used.
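The basic edge-directed decision underlying such demosaicing methods, interpolating the missing green value along the direction with the smaller gradient, can be sketched as follows (a generic textbook illustration, not the proposed VLSI design or its neighbor-based refinement):

```python
def interp_green(g_up, g_down, g_left, g_right):
    """Edge-directed green interpolation at a red/blue Bayer site.

    The direction with the smaller gradient is assumed to run along an
    edge, so the average is taken along it instead of across it.
    """
    dv = abs(g_up - g_down)        # vertical gradient
    dh = abs(g_left - g_right)     # horizontal gradient
    if dh < dv:
        return (g_left + g_right) / 2.0           # edge runs horizontally
    if dv < dh:
        return (g_up + g_down) / 2.0              # edge runs vertically
    return (g_up + g_down + g_left + g_right) / 4.0

# A vertical edge: left/right greens differ strongly, up/down agree,
# so interpolation follows the vertical direction and avoids a false color.
g = interp_green(g_up=0.6, g_down=0.6, g_left=0.1, g_right=0.9)
```

The paper's contribution is to make this per-pixel direction decision consistent with the directions already chosen at neighboring pixels, which suppresses isolated wrong decisions that cause color artifacts.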

GPU-based dynamic point light particles rendering using 3D textures for real-time rendering (실시간 렌더링 환경에서의 3D 텍스처를 활용한 GPU 기반 동적 포인트 라이트 파티클 구현)

  • Kim, Byeong Jin;Lee, Taek Hee
    • Journal of the Korea Computer Graphics Society
    • /
    • v.26 no.3
    • /
    • pp.123-131
    • /
    • 2020
  • This study proposes a real-time rendering algorithm for lighting when each of more than 100,000 moving particles exists as a light source. Two 3D textures are used to dynamically determine the range of influence of each light: the first 3D texture holds light color, and the second holds light direction information. Each frame goes through two steps. The first step updates, in a compute shader, the particle information required for 3D texture initialization and rendering. The particle position is converted to the sampling coordinates of the 3D texture, and based on these coordinates, the first 3D texture accumulates the color sum of the particle lights affecting each voxel, while the second 3D texture accumulates the sum of the direction vectors from each voxel to the particle lights. The second step operates in the general rendering pipeline. Based on the world position of the polygon to be rendered, the exact sampling coordinates of the 3D texture updated in the first step are calculated. Since the sampling coordinates correspond 1:1 to the size of the 3D texture and the size of the game world, the world coordinates of the pixel are used as the sampling coordinates. Lighting is then carried out based on the sampled color and the light direction vector. The 3D texture corresponds 1:1 to the actual game world and assumes a minimum unit of 1m, but in areas smaller than 1m, problems such as staircase artifacts occur due to resolution restrictions; interpolation and supersampling are performed during texture sampling to reduce them. Measurements of the time taken to render a frame showed that, when the number of particles was 262144, 146ms was spent on the forward lighting pipeline and 46ms on the deferred lighting pipeline, and when the number of particle lights was 1,024766, 214ms was spent on the forward lighting pipeline and 104ms on the deferred lighting pipeline.
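The 1:1 world-to-texel mapping and per-voxel accumulation described in the first step can be sketched on the CPU as follows (the texture size and the choice of measuring direction from the voxel center are assumptions for illustration; the real implementation runs in a compute shader on the GPU):

```python
import numpy as np

# 1 voxel = 1 m, so world coordinates map directly onto texel indices.
TEX_SIZE = np.array([16, 16, 16])       # voxels per axis (assumed size)

color_tex = np.zeros((*TEX_SIZE, 3))    # summed light color per voxel
dir_tex = np.zeros((*TEX_SIZE, 3))      # summed voxel-to-light directions

def splat_light(world_pos, color):
    """Accumulate one particle light into the voxel containing it."""
    voxel = np.floor(world_pos).astype(int)        # 1:1 world -> texel
    if np.any(voxel < 0) or np.any(voxel >= TEX_SIZE):
        return                                     # outside the volume
    color_tex[tuple(voxel)] += color               # color sum
    d = world_pos - (voxel + 0.5)                  # from voxel center to light
    n = np.linalg.norm(d)
    if n > 0:
        dir_tex[tuple(voxel)] += d / n             # unit-direction sum

splat_light(np.array([3.25, 4.5, 5.5]), np.array([1.0, 0.5, 0.0]))
splat_light(np.array([3.75, 4.5, 5.5]), np.array([0.0, 0.5, 1.0]))
```

At shading time a pixel's world position is used as the sampling coordinate into these textures, and the summed color and direction approximate the combined contribution of all particle lights in that voxel; two opposing lights in one voxel, as above, cancel directionally while their colors still add.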