• Title/Summary/Keyword: remote rendering

Search Results: 38

Volume Haptic Rendering Algorithm for Realistic Modeling (실감형 모델링을 위한 볼륨 햅틱 렌더링 알고리즘)

  • Jung, Ji-Chan;Park, Joon-Young
    • Korean Journal of Computational Design and Engineering / v.15 no.2 / pp.136-143 / 2010
  • Realistic modeling maximizes the sense of reality in environments perceived through a virtual environment or remote control, engaging two or more human senses. In particular, haptic rendering, which provides realism through the interaction of visual and tactile senses in a realistic model, has attracted attention. Haptic rendering calculates the force caused by model deformation during interaction with a virtual model and returns it to the user. A deformable model is more complex to handle in haptic rendering than a rigid body, because deformation must be computed inside the model as well as on its surface. For such models, Gibson proposed the 3D ChainMail algorithm using volumetric data. For deformable models with non-homogeneous materials, however, discrepancies arise between visual and tactile information when the force feedback is computed in real time. We therefore propose a volume haptic rendering algorithm for non-homogeneous deformable objects that reflects the force feedback consistently in real time, according to the visual information (the amount of deformation), without any post-processing.
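The propagation idea behind Gibson's ChainMail can be illustrated with a minimal one-dimensional sketch (hypothetical code, not from the paper): each element keeps its distance to its neighbours within [min_gap, max_gap], and a displacement propagates outward only as far as those constraints are violated.

```python
def chainmail_1d(positions, idx, new_pos, min_gap=0.5, max_gap=1.5):
    """1D analogue of ChainMail: move element idx to new_pos, then
    propagate adjustments until neighbour constraints are satisfied."""
    p = list(positions)
    p[idx] = new_pos
    # propagate to the right
    for i in range(idx + 1, len(p)):
        gap = p[i] - p[i - 1]
        if gap > max_gap:
            p[i] = p[i - 1] + max_gap   # pulled along
        elif gap < min_gap:
            p[i] = p[i - 1] + min_gap   # pushed away
        else:
            break                        # constraint holds: stop propagating
    # propagate to the left
    for i in range(idx - 1, -1, -1):
        gap = p[i + 1] - p[i]
        if gap > max_gap:
            p[i] = p[i + 1] - max_gap
        elif gap < min_gap:
            p[i] = p[i + 1] - min_gap
        else:
            break
    return p
```

Pulling the leftmost element away drags its neighbour only until the maximum-gap constraint is met again; elements further away are untouched, which is what makes ChainMail fast enough for haptic rates.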

DESIGN AND IMPLEMENTATION OF FEATURE-BASED 3D GEO-SPATIAL RENDERING SYSTEM USING OPENGL API

  • Kim Seung-Yeb;Lee Kiwon
    • Proceedings of the KSRS Conference / 2005.10a / pp.321-324 / 2005
  • The management and visualization of 3D geo-spatial information is currently regarded as an important issue in the GIS and remote sensing fields. 3D GIS is concerned with database issues such as handling and managing 3D geometry/topology attributes, whereas 3D visualization is basically a matter of 3D computer graphics. This study focuses on the design and implementation of an OpenGL API-based rendering system for complex types of 3D geo-spatial features. In this approach, 3D features can be processed separately, with functions for authoring and manipulating terrain segments, building segments, road segments, and other geo-based objects with texture mapping. With this implementation, an integrated scene composed of these complex types of 3D features can be generated. This integrated rendering system, based on the feature-based 3D GIS model, can be extended and applied effectively to urban environment analysis, 3D virtual simulation, and fly-by navigation in urban planning. Furthermore, we expect that an OpenGL API-based 3D GIS visualization application can easily be extended into a real-time mobile 3D GIS system soon after the release of OpenGL ES (OpenGL for Embedded Systems), though this topic is beyond the scope of this implementation.


Graphical display technology of internal impact in remote monitoring and simulation system (원격 모니터링 및 시뮬레이션 시스템의 내부 충격 그래픽 표시 기법)

  • Yoon, Ji-young;Lee, Hyo-jai;Woo, Deok-gun;Jang, Moon-su;Kim, Cheol-hwan
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference / 2022.10a / pp.574-576 / 2022
  • In this paper, we developed a technique to display impact effects graphically for remote monitoring and simulation systems. Remote monitoring and simulation systems are used to schedule repairs or to prevent accidents while inspecting equipment and facilities at industrial sites in real time. These systems provide users with visual information for analyzing problem situations. The technique proposed in this paper models equipment and facilities using 3D graphics and displays the location of impacts and damage occurring inside the equipment using volume rendering. It has the advantage that a problem can be identified more accurately, because a volume-rendered impact animation is displayed at the location of the impact or damage inside the equipment. We also expect that the stronger visual effect allows the problem situation to be identified more quickly.


Selection of a Remote Phosphor Configuration to Enhance the Color Quality of White LEDs

  • Anh, Nguyen Doan Quoc;Le, Phan Xuan;Lee, Hsiao-Yi
    • Current Optics and Photonics / v.3 no.1 / pp.78-85 / 2019
  • The remote phosphor structure has been shown to have greater luminous efficiency than both the conformal phosphor and in-cup phosphor structures; however, controlling its color quality is much more challenging. To address this, various researchers have proposed dual-layer and triple-layer phosphor configurations as techniques to enhance the display brightness of white LEDs (WLEDs). Accordingly, this study selected one of these configurations for use in multichip WLEDs with five distinct color temperatures in the range from 5600 to 8500 K, with the aim of improving optical properties of WLEDs such as the color rendering index (CRI), color quality scale (CQS), luminous efficacy (LE), and chromatic homogeneity. The results show that the triple-layer phosphor configuration outperforms the other configurations in terms of CRI, CQS, and LE, and yields higher chromatic stability for WLEDs.

A Prototype Implementation for 3D Animated Anaglyph Rendering of Multi-typed Urban Features using Standard OpenGL API

  • Lee, Ki-Won
    • Korean Journal of Remote Sensing / v.23 no.5 / pp.401-408 / 2007
  • Animated anaglyph is the most cost-effective method for 3D stereo visualization of virtual or actual 3D geo-based data models. Unlike 3D anaglyph scene generation using paired epipolar images, the main data set of this study is a multi-typed 3D feature model containing 3D shaped objects, a DEM, and satellite imagery. A prototype for 3D animated anaglyph rendering using the OpenGL API is implemented, and virtual 3D feature modeling is performed to demonstrate the applicability of this anaglyph approach. Although the 3D features are not real objects at this stage, they can be substituted with actual 3D feature models with full texture images on all facades. Currently this is regarded as a special viewing effect within 3D GIS application domains, because stereo 3D viewing is just one of many GIS functionalities and remote sensing image processing modules. The animated anaglyph process can be linked with real-time manipulation of a 3D feature model and its database attributes in real-world problems. This feature-based 3D animated anaglyph scheme is also a bridging technology toward image-based 3D animated anaglyph rendering systems, portable mobile 3D stereo viewing systems, and glasses-free auto-stereo viewing systems for multiple viewers.
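As a rough illustration of the anaglyph principle this paper builds on (a sketch, not the paper's OpenGL implementation): a red-cyan anaglyph takes the red channel from the left-eye view and the green and blue channels from the right-eye view.

```python
def anaglyph_pixel(left_rgb, right_rgb):
    """Red-cyan anaglyph composition for one pixel: red from the
    left-eye view, green and blue from the right-eye view."""
    return (left_rgb[0], right_rgb[1], right_rgb[2])

def make_anaglyph(left, right):
    """Compose two equal-sized 2D lists of (r, g, b) tuples into one
    anaglyph image, pixel by pixel."""
    return [[anaglyph_pixel(l, r) for l, r in zip(row_l, row_r)]
            for row_l, row_r in zip(left, right)]
```

In an OpenGL renderer the same effect is typically achieved per eye with channel write masks rather than per-pixel composition, which is what makes the animated (per-frame) variant cheap.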

Real-Time Haptic Rendering of Slowly Deformable Bodies Based on Two Dimensional Visual Information for Telemanipulation (원격조작을 위한 2차원 영상정보에 기반한 저속 변형체의 실시간 햅틱 렌더링)

  • Kim, Jung-Sik;Kim, Young-Jin;Kim, Jung
    • Transactions of the Korean Society of Mechanical Engineers A / v.31 no.8 / pp.855-861 / 2007
  • Haptic rendering is a process that provides force feedback during interaction between a user and a virtual object. This paper presents a real-time haptic rendering technique for deformable objects based on visual information about the intervention between a tool and a real object in a remote place. A user can feel an artificial reaction force through a haptic device in real time while a slave system performs manipulation tasks on a deformable object. Models of the deformable object and the manipulator are created from images captured with a CCD camera, and object recognition is achieved using image processing techniques. The force at the 1 kHz rate required for stable haptic interaction is deduced by extrapolating forces obtained at a low update rate. The rendering algorithm was tested and validated on a platform consisting of a one-dimensional indentation device and an off-the-shelf force feedback device. This software system can be used in a cellular manipulation system, providing artificial force feedback to enhance the success rate of operations.
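The extrapolation step can be sketched as follows (illustrative code; the paper's exact scheme is not specified beyond extrapolation from a low update rate): given the two most recent low-rate force samples, the 1 kHz haptic loop extends their trend linearly to the current servo time.

```python
def extrapolate_force(t0, f0, t1, f1, t):
    """Linearly extrapolate a force signal sampled at a low rate
    (e.g. camera frame rate) to the haptic servo time t, so the
    1 kHz loop always has a force value to render."""
    slope = (f1 - f0) / (t1 - t0)   # trend of the last two samples
    return f1 + slope * (t - t1)    # extend that trend past t1
```

Each time a new vision-based force sample arrives, (t0, f0) and (t1, f1) are shifted forward, bounding how far the extrapolation can drift from the measured force.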

Physically-based Haptic Rendering of a Deformable Object Using Two Dimensional Visual Information for Teleoperation (원격조작을 위한 이차원 영상정보를 이용한 변형체의 물리적 모델 기반 햅틱 렌더링)

  • Kim, Jung-Sik;Kim, Jung
    • Proceedings of the HCI Society of Korea Conference / 2008.02c / pp.19-24 / 2008
  • This paper presents a physically-based haptic rendering algorithm for a deformable object, based on visual information about the intervention between a tool and a real object in a remote place. The physically-based model of the deformable object is created from the mechanical properties of the object and images captured with a CCD camera. When a slave system performs manipulation tasks on the deformable object, the reaction force for haptic rendering is computed using the boundary element method. The snakes algorithm is used to obtain the geometry of the deformable object. The proposed haptic rendering algorithm can provide haptic feedback to a user without a force transducer in a teleoperation system.


A 2-Tier Server Architecture for Real-time Multiple Rendering (실시간 다중 렌더링을 위한 이중 서버 구조)

  • Lim, Choong-Gyoo
    • Journal of Korea Game Society / v.12 no.4 / pp.13-22 / 2012
  • The widespread availability of broadband Internet service makes cloud computing-based gaming services possible. A game program is executed on a cloud node, and its live image is fed to a remote user's display device via video streaming. The user's input is immediately transmitted and applied to the game. Minimizing the time to process the remote user's input and transmit the live image back to the user satisfies gaming's requirement of instant responsiveness and thus makes the service feasible. However, servers for high-quality 3D games can be very expensive to build, because the general-purpose graphics systems that cloud nodes typically have support only a single 3D application at a time. The server therefore needs a 'real-time multiple rendering' technology to execute multiple 3D games simultaneously. This paper proposes a new 2-tier server architecture in which one group of cloud nodes executes multiple games and the other produces the games' live images. It also presents experiments that demonstrate the feasibility of the new architecture.
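The hand-off between the two tiers might be sketched, under loose assumptions, as a producer-consumer pipeline: one tier stands in for game execution and rendering, the other for encoding the resulting frames for streaming. The names and the queue-based coupling are illustrative, not the paper's design.

```python
import queue
import threading

def run_2tier(frames):
    """Toy 2-tier pipeline: a 'game tier' thread produces rendered
    frames, an 'encoding tier' thread turns them into stream packets."""
    handoff, out = queue.Queue(), []

    def game_tier():
        for f in frames:
            handoff.put(f * 2)        # stand-in for game update + render
        handoff.put(None)             # sentinel: game session finished

    def streaming_tier():
        while (f := handoff.get()) is not None:
            out.append(f"enc:{f}")    # stand-in for video encoding

    t1 = threading.Thread(target=game_tier)
    t2 = threading.Thread(target=streaming_tier)
    t1.start(); t2.start()
    t1.join(); t2.join()
    return out
```

Separating the two roles across node groups is what lets the rendering tier multiplex several games while the encoding tier scales independently with the number of streams.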

Implementation of Real-time Interactive Ray Tracing on GPU (GPU 기반의 실시간 인터렉티브 광선추적법 구현)

  • Bae, Sung-Min;Hong, Hyun-Ki
    • Journal of Korea Game Society / v.7 no.3 / pp.59-66 / 2007
  • Ray tracing is one of the classical global illumination methods for generating photo-realistic rendered images with lighting effects such as reflection and refraction. However, its computational load restricts real-time applications. To overcome this limitation, much research on GPU (Graphics Processing Unit)-based ray tracing has been presented. In this paper, we implement the ray tracing algorithm by J. Purcell and combine it with two methods to improve rendering performance for interactive applications. First, intersection points of the primary rays are determined efficiently using rasterization on graphics hardware. Second, an acceleration structure is constructed over the 3D objects to improve rendering performance. Few studies have analyzed in detail the performance gains these techniques bring to ray tracing. We compare the rendering system with GPU-based environment mapping and implement a wireless remote rendering system. This system is useful for interactive applications such as real-time compositing, augmented reality, and virtual reality.
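For reference, the primary-ray intersection test that the paper accelerates with GPU rasterization reduces, for a sphere, to solving a quadratic in the ray parameter t; a plain CPU sketch (illustrative, not the paper's GPU code):

```python
import math

def ray_sphere(origin, direction, center, radius):
    """Return the nearest positive hit distance t of a ray with a
    sphere, or None on a miss. Solves |o + t*d - c|^2 = r^2 for t."""
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    a = dx * dx + dy * dy + dz * dz
    b = 2.0 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * a * c        # discriminant: < 0 means no hit
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / (2.0 * a)  # nearer of the two roots
    return t if t > 0 else None
```

On the GPU, rasterizing the scene once replaces this per-pixel search for the first hit, leaving the shader to trace only secondary (reflection/refraction) rays.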


Red-emitting α-SrO·3B2O3:Sm2+ Phosphor for WLED Lamps: Novel Lighting Properties with Two-layer Remote Phosphor Package

  • Tin, Phu Tran;Nguyen, Nhan K.H.;Tran, Minh Q.H.;Lee, Hsiao-Yi
    • Current Optics and Photonics / v.1 no.4 / pp.389-395 / 2017
  • This paper investigates a method to improve the lighting performance of white light-emitting diodes (WLEDs) packaged with two separate remote phosphor layers: a yellow-emitting YAG:Ce phosphor layer and a red-emitting α-SrO·3B2O3:Sm2+ phosphor layer, 800 μm and 200 μm thick, respectively. Both are examined at average correlated color temperatures (CCT) of 7700 K and 8500 K. In this two-layer model, the red phosphor concentration in the upper layer is varied from 2% to 30%, while the yellow phosphor concentration in the lower layer is kept at 15%. Interestingly, it was found that lighting properties such as the color rendering index (CRI) and luminous flux are enhanced significantly, while color uniformity remains relatively close to that of the one-layer configuration measured at the same correlated color temperature. In addition, the transmitted and reflected light of each phosphor layer are analyzed by combining the Kubelka-Munk and Mie-Lorenz theories. The analysis demonstrates that the two-layer remote phosphor package employing red-emitting α-SrO·3B2O3:Sm2+ phosphor particles provides a practical solution for general WLED lighting.
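The Kubelka-Munk part of such an analysis rests on the standard relation between the absorption/scattering ratio and the reflectance of an optically thick layer, K/S = (1 − R∞)² / (2R∞); a minimal sketch of that relation (illustrative helper, not the paper's code):

```python
def kubelka_munk_ratio(r_inf):
    """Kubelka-Munk function: K/S = (1 - R)^2 / (2R), where R is the
    diffuse reflectance of an optically thick (infinite) layer, K the
    absorption coefficient, and S the scattering coefficient."""
    return (1.0 - r_inf) ** 2 / (2.0 * r_inf)
```

A strongly absorbing phosphor layer (low R∞) gives a large K/S, which is why layer thickness and particle concentration trade off directly against transmitted flux in such packages.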