• Title/Summary/Keyword: Colour Interpolation


Colour Interpolation of Tongue Image in Digital Tongue Image System Blocking Out External Light (디지털 설진 시스템의 색상 보정)

  • Kim, Ji-Hye;Nam, Dong-Hyun
    • The Journal of the Society of Korean Medicine Diagnostics / v.16 no.1 / pp.9-18 / 2012
  • Objectives: The aim of this study was to propose an optimized tongue colour interpolation method for accurate tongue image rendering. Methods: We selected 60 colour chips from the DIC colour guide selector and randomly divided them into two groups. The chips of one group (Gr I) were used to find the optimized colour correction factor, and those of the other group (Gr II) were used to verify it. We measured the colour values of the Gr I chips with a spectrophotometer and captured images of the chips with a digital tongue image system (DTIS). We then adjusted the colour correction factor until the image colours matched the measured chip colours, thereby obtaining the optimized colour correction factor. To verify the factor, we measured the colour values of the Gr II chips with the spectrophotometer and captured chip images with the DTIS in two colour interpolation modes (auto white balance mode and optimized colour correction factor mode). We then calculated the CIE-$L^*a^*b^*$ colour difference (${\Delta}E$) between the colour values measured with the spectrophotometer and those taken from the DTIS images. Results: In auto white balance mode, the mean ${\Delta}E$ between the spectrophotometer values and the DTIS image values was 13.95; in optimized colour correction factor mode, it was 9.55, a correction rate of over 30%. Conclusions: When interpolating the colour of images taken with the DTIS, we suggest that the procedure to find the optimized colour correction factor be performed first.
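The ${\Delta}E$ comparison above can be sketched in a few lines. This is a minimal sketch assuming the simple CIE76 Euclidean form (the abstract does not state which ${\Delta}E$ formula was used), and the chip values below are hypothetical:

```python
import math

def delta_e_cie76(lab1, lab2):
    """CIE76 colour difference between two CIE-L*a*b* triples."""
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(lab1, lab2)))

# Hypothetical chip: spectrophotometer reading vs. value taken from a camera image
measured = (52.3, 18.4, 9.1)
captured = (49.0, 25.1, 4.2)
print(round(delta_e_cie76(measured, captured), 2))  # a single per-chip error
```

Averaging this value over all Gr II chips in each mode would reproduce the paper's comparison of the two interpolation modes.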

A Study on the Color Proofing CMS Development for the KOREA Offset Printing Industry (한국 오프셋 인쇄산업에 적합한 CMS 개발에 관한 연구)

  • Song, Kyung-Chul;Kang, Sang-Hoon
    • Journal of the Korean Graphic Arts Communication Society / v.25 no.1 / pp.121-133 / 2007
  • The CMS (color management system) software was developed to enable consistent colour reproduction from original to reproduction. The CMS creates RGB monitor and printer characterization profiles and then uses those profiles for device-independent colour transformation. The implemented CMM (color management module) used the CIELAB colour space as the profile connection space. Various monitor characterization models were evaluated for proper colour transformation. To construct the output device profile, the SLI (sequential linear interpolation) method was used for the forward conversion from CMYK device colour to the device-independent CIELAB colour space, and tetrahedral interpolation was used for the backward transformation. A UCR (under colour removal) based black generation algorithm was used to construct the CIELAB-to-CMYK LUT (lookup table). When transforming from the CIELAB colour space to CMYK, it was possible to incorporate a grey revision method, regularized in lightness, into the colour transformation process and to optimize the transformation with the UCR-based black generation method. For soft-copy colour proofing, an evaluation of several monitor characterization methods showed that the LUT algorithm was useful. It was also possible to simplify colour gamut mapping by combining the lookup table and the gamut mapping algorithm into a single reference table.
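The UCR-based black generation step above can be illustrated with a short sketch. This is a hedged, simplified version of the general technique (function name and the `ucr_amount` parameter are my own, not from the paper): a fraction of the neutral grey component shared by C, M, and Y is replaced with black ink.

```python
def ucr_black_generation(c, m, y, ucr_amount=0.5):
    """Under colour removal: replace a fraction of the neutral (grey)
    component of C, M, Y with black.  ucr_amount is a hypothetical
    tuning parameter (0 = no black, 1 = full grey replacement)."""
    grey = min(c, m, y)      # neutral component carried by all three inks
    k = ucr_amount * grey    # black generated from that grey component
    return c - k, m - k, y - k, k

# A dark olive: half of the 0.6 grey component moves into the K channel
print(ucr_black_generation(0.8, 0.6, 0.7))
```

In the paper's pipeline a rule like this would be applied at every grid point when building the CIELAB-to-CMYK lookup table, so that the backward (tetrahedral-interpolation) transform yields CMYK values with controlled black generation.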


GPU-based dynamic point light particles rendering using 3D textures for real-time rendering (실시간 렌더링 환경에서의 3D 텍스처를 활용한 GPU 기반 동적 포인트 라이트 파티클 구현)

  • Kim, Byeong Jin;Lee, Taek Hee
    • Journal of the Korea Computer Graphics Society / v.26 no.3 / pp.123-131 / 2020
  • This study proposes a real-time rendering algorithm for lighting when each of more than 100,000 moving particles acts as a light source. Two 3D textures are used to dynamically determine the range of influence of each light: the first holds the light colour and the second holds the light direction information. Each frame goes through two steps. The first step, based on a compute shader, updates the particle information required for 3D texture initialization and rendering. Each particle position is converted to the sampling coordinates of the 3D texture, and based on these coordinates the first 3D texture accumulates the colour sum of the particle lights affecting the corresponding voxel, while the second accumulates the sum of the direction vectors from that voxel to the particle lights. The second step runs in the regular rendering pipeline. From the world position of the polygon being rendered, the exact sampling coordinates of the 3D texture updated in the first step are calculated. Since the texture corresponds 1:1 to the size of the game world, the pixel's world coordinates are used directly as the sampling coordinates. Lighting is then computed from the sampled light colour and direction vector. The 3D texture corresponds 1:1 to the actual game world and assumes a minimum voxel size of 1 m, so in regions smaller than 1 m staircase artifacts appear due to the limited resolution; interpolation and supersampling are performed during texture sampling to mitigate these problems. Frame-time measurements showed 146 ms for the forward lighting pipeline and 46 ms for the deferred lighting pipeline with 262,144 particle lights, and 214 ms (forward) and 104 ms (deferred) with 1,024,766 particle lights.
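The two-step scheme above can be sketched on the CPU. This is a minimal illustration under stated assumptions (the grid size, function names, and voxel-centre convention are mine, not the paper's): the 3D texture maps 1:1 to the world with 1 m voxels, step 1 accumulates light colour per voxel, and step 2 samples with trilinear interpolation to soften the 1 m staircase artifacts.

```python
GRID = 8  # hypothetical texture size: an 8 x 8 x 8 m world region

def make_texture():
    """3D texture of accumulated RGB light colour, one voxel per metre."""
    return [[[[0.0, 0.0, 0.0] for _ in range(GRID)]
             for _ in range(GRID)] for _ in range(GRID)]

def splat_light(tex, pos, colour):
    """Step 1 (compute-shader stage): add a particle light's colour
    into the voxel containing its world position."""
    x, y, z = (min(int(c), GRID - 1) for c in pos)
    for i in range(3):
        tex[z][y][x][i] += colour[i]

def sample_trilinear(tex, pos):
    """Step 2 (render stage): sample the texture at a pixel's world
    position, blending the 8 surrounding voxel centres."""
    x, y, z = (min(max(c - 0.5, 0.0), GRID - 1.001) for c in pos)
    x0, y0, z0 = int(x), int(y), int(z)
    fx, fy, fz = x - x0, y - y0, z - z0
    out = [0.0, 0.0, 0.0]
    for dz in (0, 1):
        for dy in (0, 1):
            for dx in (0, 1):
                w = ((fx if dx else 1 - fx) *
                     (fy if dy else 1 - fy) *
                     (fz if dz else 1 - fz))
                v = tex[min(z0 + dz, GRID - 1)][min(y0 + dy, GRID - 1)][min(x0 + dx, GRID - 1)]
                for i in range(3):
                    out[i] += w * v[i]
    return out
```

On the GPU, `sample_trilinear` corresponds to hardware-filtered `sampler3D` reads, and `splat_light` to atomic additions in the compute pass; the second 3D texture (direction sums) would be handled identically.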