• Title/Summary/Keyword: interactive rendering

Multi-gigabyte Multimedia Collections Using Qis Visualization Spreadsheet

  • 지승현
    • Proceedings of the Korea Contents Association Conference
    • /
    • 2004.05a
    • /
    • pp.207-214
    • /
    • 2004
  • The Qis visualization spreadsheet environment is shown to be extremely effective in supporting the visualization of multi-gigabyte multi-dimensional data sets. Qis has a novel framestack, a 3-D arrangement of spreadsheet elements, which enables the visualization spreadsheet to effectively manage, rapidly organize, and compactly encapsulate multi-dimensional data sets for visualization. In several experiments with scientific users, Qis has been demonstrated to be a highly interactive visual browsing tool for the analysis of multi-dimensional data, displaying 2-D/3-D graphics and rendering in each frame of the spreadsheet.
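The "framestack" above is described only as a 3-D arrangement of spreadsheet elements, so the following is a purely illustrative sketch of such a structure in Python; the class and method names (FrameStack, put, column) are hypothetical and not taken from Qis.

```python
# Hypothetical sketch of a "framestack": a 3-D arrangement of spreadsheet
# cells, each holding one slice/variant of a multi-dimensional data set.
# All names are illustrative; the actual Qis data model is not described here.
import numpy as np

class FrameStack:
    def __init__(self, rows, cols, depth):
        # cells[i][j][k] holds the data set (or None) at row i, column j, frame k
        self.cells = [[[None for _ in range(depth)]
                       for _ in range(cols)] for _ in range(rows)]

    def put(self, i, j, k, dataset):
        self.cells[i][j][k] = dataset

    def column(self, i, j):
        """Return the stack of frames behind one spreadsheet cell."""
        return [d for d in self.cells[i][j] if d is not None]

stack = FrameStack(rows=2, cols=3, depth=4)
stack.put(0, 0, 0, np.random.rand(64, 64, 64))   # one small volume per frame
print(len(stack.column(0, 0)))
```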

A Fast Volume Rendering Algorithm for Virtual Endoscopy

  • Ra Jong Beom;Kim Sang Hun;Kwon Sung Min
    • Journal of Biomedical Engineering Research
    • /
    • v.26 no.1
    • /
    • pp.23-30
    • /
    • 2005
  • 3D virtual endoscopy has been used as an alternative non-invasive procedure for visualization of hollow organs. However, due to computational complexity, this is a time-consuming procedure. In this paper, we propose a fast volume rendering algorithm based on perspective ray casting for virtual endoscopy. As a pre-processing step, the algorithm divides a volume into hierarchical blocks and classifies them into opaque or transparent blocks. Then, in the first step, we perform ray casting only for sub-sampled pixels on the image plane, and determine their pixel values and depth information. In the next step, by reducing the sub-sampling factor by half, we repeat ray casting for newly added pixels, and their pixel values and depth information are determined. Here, the previously obtained depth information is utilized to reduce the processing time. This step is recursively performed until a full-size rendering image is acquired. Experiments conducted on a PC show that the proposed algorithm can reduce the rendering time by 70-80% for bronchus and colon endoscopy, compared with the brute-force ray casting scheme. Using the proposed algorithm, interactive volume rendering becomes more realizable in a PC environment without any special-purpose hardware.
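As a rough illustration of the coarse-to-fine scheme in this abstract, the sketch below casts rays for every fourth pixel first and seeds finer rays with the depth of a coarser neighbour. It assumes a toy axis-aligned ray marcher and depth coherence between neighbours; the paper's perspective ray casting and hierarchical block classification are not reproduced.

```python
import numpy as np

def first_hit(volume, x, y, start=0, threshold=0.5):
    """March along +z from `start`; return the first opaque depth, or -1."""
    for z in range(start, volume.shape[2]):
        if volume[x, y, z] >= threshold:
            return z
    return -1

def progressive_raycast(volume, coarsest=4, margin=2):
    """Coarse-to-fine ray casting: rays are first cast for every `coarsest`-th
    pixel; at each finer level, the depth of an already-resolved coarser
    neighbour (minus a small safety margin) seeds the new rays."""
    nx, ny, _ = volume.shape
    depth = np.full((nx, ny), -2, dtype=int)          # -2: not yet traced
    step = coarsest
    while step >= 1:
        for x in range(0, nx, step):
            for y in range(0, ny, step):
                if depth[x, y] != -2:
                    continue                          # resolved at a coarser level
                cx, cy = x - x % (step * 2), y - y % (step * 2)
                near = depth[cx, cy] if step < coarsest else -1
                start = max(near - margin, 0) if near >= 0 else 0
                depth[x, y] = first_hit(volume, x, y, start=start)
        step //= 2
    return depth

# Synthetic test volume: a solid sphere in a 64^3 grid.
g = np.indices((64, 64, 64))
vol = (((g - 32) ** 2).sum(axis=0) < 20 ** 2).astype(float)
d = progressive_raycast(vol)
print(d.min(), d.max())
```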

SURE-based À-Trous Wavelet Filter for Interactive Monte Carlo Rendering (몬테카를로 렌더링을 위한 슈어기반 실시간 에이트러스 웨이블릿 필터)

  • Kim, Soomin;Moon, Bochang;Yoon, Sung-Eui
    • Journal of KIISE
    • /
    • v.43 no.8
    • /
    • pp.835-840
    • /
    • 2016
  • Monte Carlo ray tracing has been widely used for simulating a diverse set of photo-realistic effects. However, this technique typically produces noise when insufficient numbers of samples are used. As the number of samples allocated per pixel is increased, the rendered images converge, but generating sufficient numbers of samples requires prohibitive rendering time. To solve this problem, image filtering can be applied to rendered images: instead of naively generating additional rays, the noisy image rendered with low sample counts is filtered to obtain a smoothed image. In this paper, we propose a Stein's Unbiased Risk Estimator (SURE)-based À-Trous wavelet to filter the noise in rendered images at a near-interactive rate. Based on SURE, we can estimate the filtering errors associated with the À-Trous wavelet and identify the wavelet coefficients that reduce those errors. Our approach showed improvements of up to 6:1 over the original À-Trous filter on various regions of the image, while incurring only a minor computational overhead. We have integrated our proposed filtering method with the recent interactive ray tracing system Embree and demonstrated its benefits.
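For orientation, the sketch below shows the plain à-trous ("with holes") wavelet smoothing that the paper builds on, using the usual 5-tap B3-spline kernel with growing hole spacing. The paper's SURE-based selection of wavelet coefficients is only noted in a comment and not implemented here.

```python
import numpy as np
from scipy.ndimage import convolve

# B3-spline kernel commonly used by the a-trous ("with holes") wavelet transform.
B3 = np.array([1, 4, 6, 4, 1], dtype=float) / 16.0

def atrous_level(img, level):
    """One a-trous smoothing pass: the 5-tap kernel with 2**level - 1 zeros
    ("holes") inserted between taps, applied separably in x and y."""
    spacing = 2 ** level
    k = np.zeros(4 * spacing + 1)
    k[::spacing] = B3
    smooth = convolve(img, k[None, :], mode='mirror')
    return convolve(smooth, k[:, None], mode='mirror')

def atrous_filter(noisy, levels=3):
    """Plain multi-level a-trous smoothing. The paper's method additionally
    estimates per-pixel filtering error with SURE and keeps only the wavelet
    coefficients that reduce that error; that step is omitted here."""
    out = noisy
    for i in range(levels):
        out = atrous_level(out, i)
    return out

rng = np.random.default_rng(0)
clean = np.tile(np.linspace(0, 1, 128), (128, 1))    # stand-in for a rendered image
noisy = clean + rng.normal(0, 0.1, clean.shape)      # Monte Carlo-like noise
print(np.abs(atrous_filter(noisy) - clean).mean() < np.abs(noisy - clean).mean())
```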

An Efficient Visualization Method for Interactive Volume Rendering (대화식 볼륨 렌더링을 지원하는 효율적인 가시화 방법)

  • Kim, Tae-Young
    • Journal of the Korea Computer Graphics Society
    • /
    • v.8 no.1
    • /
    • pp.1-11
    • /
    • 2002
  • In order for volume rendering technology to be widely used in practical fields, a user should be able to control the classification parameters interactively and extract meaningful information from the 3D data as quickly as possible. Previous work on accelerating volume rendering reconstructs an isotropic volume from an anisotropic one and classifies it at pre-processing time, then renders the classified volume rapidly at run time. However, this traditional approach may result in long pre-processing times and no real-time feedback. In this paper, we present an efficient classification and rendering method that allows a user to set the opacity transfer function interactively at rendering time on a personal computer without special-purpose hardware.
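The interactive classification described above is commonly realized by keeping the opacity transfer function as a small lookup table evaluated per sample at render time, so editing it never touches the volume data. The sketch below illustrates that idea; the ramp-shaped transfer function and the parameter names are assumptions, not the paper's design.

```python
import numpy as np

def make_opacity_tf(center, width, max_opacity=1.0, n=256):
    """A simple ramp opacity transfer function over 8-bit scalar values.
    Editing `center`/`width` only rebuilds this 256-entry table, not the volume."""
    x = np.arange(n)
    ramp = np.clip((x - (center - width / 2)) / width, 0.0, 1.0)
    return ramp * max_opacity

def composite_ray(samples, tf):
    """Front-to-back alpha compositing of one ray of 8-bit samples."""
    color, alpha = 0.0, 0.0
    for s in samples:
        a = tf[s]
        c = s / 255.0                  # grayscale stand-in for a color table
        color += (1.0 - alpha) * a * c
        alpha += (1.0 - alpha) * a
        if alpha > 0.99:               # early ray termination
            break
    return color

ray = np.random.default_rng(1).integers(0, 256, size=128)
tf = make_opacity_tf(center=180, width=40)   # the user drags these at render time
print(round(composite_ray(ray, tf), 3))
```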

Interactive Non-Photorealistic Rendering Using Pointillism Techniques (점묘 기법을 이용한 상호적 비실사 기법)

  • Han Man-Jun;Oh Se-Yoon;Lim Soon-Bum;Choy Yoon-Chul
    • Proceedings of the Korean Information Science Society Conference
    • /
    • 2005.11a
    • /
    • pp.724-726
    • /
    • 2005
  • Computer graphics has so far been dominated by photorealism, which aims to reproduce images that look like photographs. Recently, however, in many fields such as technical documentation, medicine, and education, non-photorealistic rendering (NPR), which emphasizes the characteristic features of objects and omits unnecessary details rather than imitating photographs, has been emerging. Images generated by non-photorealistic techniques highlight an object's features, so NPR can be seen as a rendering approach that focuses on conveying meaning through those features rather than on photograph-like realistic depiction. NPR spans a variety of styles, including artistic techniques such as watercolor, ink-and-wash painting, oil painting, charcoal, and sketch drawing, as well as cartoon rendering. In this paper, we review previously published work on non-photorealistic rendering based on pointillism, which depicts objects with points, the basic element of all drawings, and propose a new technique.
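Because the abstract surveys pointillism-based NPR rather than specifying an algorithm, the following is only a generic sketch of the common core of such techniques: scattering dots with a density driven by image tone. Everything in it (function name, dot count, tone mapping) is an assumption.

```python
import numpy as np

def pointillize(img, n_points=5000, rng=None):
    """Very rough pointillist pass: darker source pixels attract more dots.
    Returns (x, y, gray) tuples; a real system would also vary dot size,
    color, and stroke orientation."""
    rng = rng or np.random.default_rng(0)
    h, w = img.shape
    density = 1.0 - img                     # darkness as placement probability
    density /= density.sum()
    idx = rng.choice(h * w, size=n_points, p=density.ravel())
    ys, xs = np.unravel_index(idx, (h, w))
    return list(zip(xs.tolist(), ys.tolist(), img[ys, xs].tolist()))

# Synthetic grayscale "image": a bright disk on a dark background.
yy, xx = np.mgrid[0:128, 0:128]
img = ((xx - 64) ** 2 + (yy - 64) ** 2 < 40 ** 2).astype(float)
dots = pointillize(img)
print(len(dots), dots[0])
```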

Multimodal Interaction on Automultiscopic Content with Mobile Surface Haptics

  • Kim, Jin Ryong;Shin, Seunghyup;Choi, Seungho;Yoo, Yeonwoo
    • ETRI Journal
    • /
    • v.38 no.6
    • /
    • pp.1085-1094
    • /
    • 2016
  • In this work, we present interactive automultiscopic content with mobile surface haptics for multimodal interaction. Our system consists of a 40-view automultiscopic display and a tablet supporting surface haptics in an immersive room. Animated graphics are projected onto the walls of the room. The 40-view automultiscopic display is placed at the center of the front wall. The haptic tablet is installed at the mobile station to enable the user to interact with the tablet. The 40-view real-time rendering and multiplexing technology is applied by establishing virtual cameras in the convergence layout. Surface haptics rendering is synchronized with three-dimensional (3D) objects on the display for real-time haptic interaction. We conduct an experiment to evaluate user experiences of the proposed system. The results demonstrate that the system's multimodal interaction provides positive user experiences of immersion, control, user interface intuitiveness, and 3D effects.
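The "40-view real-time rendering by establishing virtual cameras in the convergence layout" can be pictured as a fan of cameras along a baseline, all aimed at one convergence point. The sketch below builds such view matrices; the toe-in interpretation, baseline, and distances are assumptions, not values from the paper.

```python
import numpy as np

def look_at(eye, target, up=(0.0, 1.0, 0.0)):
    """Standard right-handed look-at view matrix (4x4)."""
    eye, target, up = map(np.asarray, (eye, target, up))
    f = target - eye; f = f / np.linalg.norm(f)
    s = np.cross(f, up); s = s / np.linalg.norm(s)
    u = np.cross(s, f)
    m = np.eye(4)
    m[0, :3], m[1, :3], m[2, :3] = s, u, -f
    m[:3, 3] = -m[:3, :3] @ eye
    return m

def convergence_cameras(n_views=40, baseline=0.6, distance=2.0):
    """Cameras spread along the x axis, all converging on a common point in
    front of the display (a 'toe-in' convergence layout; numbers are made up)."""
    target = np.array([0.0, 0.0, -distance])
    xs = np.linspace(-baseline / 2, baseline / 2, n_views)
    return [look_at((x, 0.0, 0.0), target) for x in xs]

views = convergence_cameras()
print(len(views), views[0].shape)
```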

GPU based Maximum Intensity Projection using Clipping Plane Re-rendering Method (절단면 재렌더링 기법을 이용한 GPU 기반 MIP 볼륨 렌더링)

  • Hong, In-Sil;Kye, Hee-Won;Shin, Yeong-Gil
    • Journal of Korea Multimedia Society
    • /
    • v.10 no.3
    • /
    • pp.316-324
    • /
    • 2007
  • Maximum Intensity Projection (MIP) identifies patients' anatomical structures from MR or CT data sets. Recently, it has become possible to generate MIP images at interactive speed by exploiting the Graphics Processing Unit (GPU), even for large volume data sets. In hardware texture-based volume rendering, the volume boundary planes generally cross the view-aligned texture planes obliquely. Since the sampling density along the ray is not increased at the volume boundary, aliasing artifacts occur due to data loss. In this paper, we propose an efficient method to overcome this problem by re-rendering the volume boundary planes. Our method improves image quality by making the spacing between samples dense near the volume boundary, which is a high-frequency region. Since only six clipping planes are additionally needed for re-rendering, high-quality rendering can be performed without sacrificing computational efficiency. Furthermore, our method can be applied to Minimum Intensity Projection (MinIP) volume rendering.
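For context, the core MIP operator that this entry accelerates is simply a per-ray maximum; the sketch below shows it for the axis-aligned case, with the paper's clipping-plane re-rendering idea summarized only as a comment. This is not the GPU implementation described above.

```python
import numpy as np

def mip_axis_aligned(volume, axis=2):
    """Maximum Intensity Projection along one axis: every output pixel keeps the
    brightest sample encountered along its ray (np.max does exactly this)."""
    return volume.max(axis=axis)

# Synthetic CT-like volume: dim noisy background with one bright blob.
rng = np.random.default_rng(2)
vol = rng.normal(50, 5, (64, 64, 64))
vol[20:28, 30:38, 10:18] += 200
mip = mip_axis_aligned(vol)
print(mip.shape, float(mip.max()) > 200)

# The paper's contribution is orthogonal to this core operator: when the volume
# is sliced by view-aligned texture planes on the GPU, the oblique boundary is
# sampled too sparsely, so the six clipping planes bounding the volume are
# rendered again with denser sampling to suppress aliasing.
```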

A Framework for Constructing Interactive Tiled Display Applications (인터랙티브 타일드 디스플레이 응용프로그램 개발을 위한 프레임워크)

  • Cho, Yong-Joo;Kim, Seok-Hwan
    • Journal of the Korea Institute of Information and Communication Engineering
    • /
    • v.13 no.1
    • /
    • pp.37-44
    • /
    • 2009
  • This paper describes a new tiled display framework, iTDF (Interactive Tiled Display Framework), designed to support rapid construction of interactive digital 3D contents running on top of a cluster-based tiled display. The framework provides features such as launching multiple applications on cluster-based computers, moving and resizing windows, synchronization of the rendering slaves, sharing of application state over the network through distributed shared memory, and a unified input interface. This paper analyzes the requirements of the framework and describes its design and implementation. A couple of desktop-based applications were ported to iTDF to examine the usefulness and usability of the framework.
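One of the listed features, synchronization of the rendering slaves, is commonly implemented as a swap barrier: no node presents a frame until every node has finished it. The toy sketch below shows that idea with local threads; iTDF's actual network protocol is not described in the abstract, so nothing here should be read as its API.

```python
import threading
import time

# Toy illustration of keeping rendering slaves in frame lockstep: every slave
# renders its own tile, then waits at a barrier before any of them swaps buffers.
N_SLAVES, N_FRAMES = 4, 3
swap_barrier = threading.Barrier(N_SLAVES)

def render_slave(tile_id):
    for frame in range(N_FRAMES):
        time.sleep(0.01 * (tile_id + 1))   # unequal per-tile render times
        swap_barrier.wait()                # no one swaps until all are done
        print(f"tile {tile_id} swapped frame {frame}")

threads = [threading.Thread(target=render_slave, args=(i,)) for i in range(N_SLAVES)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```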

Development of Mobile Volume Visualization System (모바일 볼륨 가시화 시스템 개발)

  • Park, Sang-Hun;Kim, Won-Tae;Ihm, In-Sung
    • Journal of KIISE:Computing Practices and Letters
    • /
    • v.12 no.5
    • /
    • pp.286-299
    • /
    • 2006
  • Due to continuing technical progress in the capabilities of modeling, simulation, and sensor devices, huge volume data sets with very high resolution have become common. In scientific visualization, various interactive real-time techniques on high-performance parallel computers have been proposed to effectively render such large-scale volume data sets. In this paper, we present a mobile volume visualization system that consists of mobile clients, gateways, and parallel rendering servers. The mobile clients allow users to adaptively explore regions of interest at higher resolution levels and to interactively specify rendering/viewing parameters, which are sent to the parallel rendering servers. The gateways manage requests and responses between the mobile clients and the parallel rendering servers to provide stable services. The parallel rendering servers visualize the specified sub-volume with the rendering contexts received from the clients and then transfer the high-quality final images back. The proposed system lets multiple PDA users simultaneously share parts of a huge volume of common interest, rendering contexts, and final images in a CSCW (Computer Supported Cooperative Work) mode.
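As a purely hypothetical illustration of the client-to-server traffic such a system needs, the sketch below encodes a region of interest, a resolution level, and viewing parameters into a JSON request; the field names and format are invented for illustration, since the paper's wire protocol is not given in the abstract.

```python
import json
from dataclasses import dataclass, asdict

# Hypothetical request a mobile client could send through the gateway to a
# parallel rendering server; the field names are illustrative only.
@dataclass
class RenderRequest:
    roi_min: tuple          # corner of the sub-volume of interest (voxels)
    roi_max: tuple
    resolution_level: int   # finer levels for regions the user zooms into
    transfer_function: str  # name/ID of a shared opacity transfer function
    view_matrix: list       # 16 floats, row-major

req = RenderRequest(roi_min=(0, 0, 0), roi_max=(256, 256, 128),
                    resolution_level=2, transfer_function="bone",
                    view_matrix=[1.0 if i % 5 == 0 else 0.0 for i in range(16)])
payload = json.dumps(asdict(req))
print(len(payload), "bytes to gateway")

# The server side would decode this, render the sub-volume in parallel, and
# send a compressed final image back to every client sharing the session.
```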

Realtime Fabric Rendering with Deformed Anisotropic Reflectance (이방성 반사의 변형을 통한 실시간 옷감 렌더링)

  • Kang, Young-Min
    • Journal of Korea Game Society
    • /
    • v.10 no.4
    • /
    • pp.81-90
    • /
    • 2010
  • In this paper, an efficient method is proposed to produce photorealistic images of woven fabrics without empirical data such as measured BRDFs (bidirectional reflectance distribution functions). The proposed method is applicable both to ray-tracer-based offline renderers and to real-time applications such as games. It models the reflectance properties of woven fabric with alternating anisotropy and a deformed MDF (microfacet distribution function). Procedural modeling of the yarn structure effectively and efficiently reproduces plausible rendering of woven fabric. The experimental results show that the proposed method can be successfully applied to photorealistic rendering of diverse woven fabric materials, even in interactive applications.
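The abstract describes alternating anisotropy with a deformed microfacet distribution but does not give its exact form, so the sketch below only shows the classic Ward anisotropic specular term that fabric shaders of this kind build on: unequal roughness along the warp and weft tangents stretches the highlight. It is not the paper's model, and all parameter values are made up.

```python
import numpy as np

def normalize(v):
    return v / np.linalg.norm(v)

def ward_anisotropic(L, V, N, X, Y, ax=0.1, ay=0.4, rho_s=0.3):
    """Ward anisotropic specular term: roughness ax along tangent X (warp) and
    ay along tangent Y (weft) produce an elongated highlight. A standard BRDF,
    not the paper's deformed microfacet distribution."""
    H = normalize(L + V)
    nl, nv, nh = N @ L, N @ V, N @ H
    if nl <= 0 or nv <= 0:
        return 0.0
    e = -((H @ X / ax) ** 2 + (H @ Y / ay) ** 2) / nh ** 2
    return rho_s * np.exp(e) / (4 * np.pi * ax * ay * np.sqrt(nl * nv))

N = np.array([0.0, 0.0, 1.0])
X = np.array([1.0, 0.0, 0.0])   # warp direction
Y = np.array([0.0, 1.0, 0.0])   # weft direction
L = normalize(np.array([0.3, 0.2, 1.0]))
V = normalize(np.array([-0.3, 0.1, 1.0]))
print(round(ward_anisotropic(L, V, N, X, Y), 4))
```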