• Title/Summary/Keyword: 3D Game Rendering Engine


3D Clothes Modeling of Virtual Human for Metaverse (메타버스를 위한 가상 휴먼의 3차원 의상 모델링)

  • Kim, Hyun Woo;Kim, Dong Eon;Kim, Yujin;Park, In Kyu
    • Journal of Broadcast Engineering, v.27 no.5, pp.638-653, 2022
  • In this paper, we propose a new method for creating a 3D virtual human that reflects both the pattern of the clothes worn by a person in a high-resolution full-body frontal image and the person's body-shape data. To obtain the clothing pattern, we perform instance segmentation and clothes parsing with Cascade Mask R-CNN. We then use Pix2Pix to blur the boundaries and estimate the background color, and obtain the UV map of the 3D clothes mesh through UV-map-based warping. We also estimate the body-shape data with SMPL-X and deform the original clothes and body mesh accordingly. With the clothes UV map and the deformed clothes and body mesh, the user can finally view an animation of a 3D virtual human reflecting his or her appearance, rendered with a state-of-the-art game engine, i.e. Unreal Engine.
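The core idea of the pipeline above (mask out the clothing region, then resample it onto a UV texture) can be sketched in a few lines. This is a minimal toy illustration, not the paper's implementation: the real system uses Cascade Mask R-CNN for the mask and learned warping, whereas here the mask is given and the "warp" is a simple nearest-neighbour resampling of the masked bounding box.

```python
import numpy as np

def extract_clothes_region(image, mask):
    """Keep only pixels labeled as clothing (stand-in for the paper's
    Cascade Mask R-CNN clothes-parsing step)."""
    region = image.copy()
    region[~mask] = 0
    return region

def warp_to_uv(region, mask, uv_size=(16, 16)):
    """Toy UV-map fill: crop the masked bounding box and resample it
    onto a square UV texture with nearest-neighbour sampling (a crude
    stand-in for the paper's UV-map-based warping)."""
    ys, xs = np.nonzero(mask)
    crop = region[ys.min():ys.max() + 1, xs.min():xs.max() + 1]
    h, w = crop.shape[:2]
    uv_h, uv_w = uv_size
    row_idx = np.arange(uv_h) * h // uv_h   # nearest source row per UV row
    col_idx = np.arange(uv_w) * w // uv_w   # nearest source col per UV col
    return crop[row_idx][:, col_idx]

# Toy 8x8 "photo" with a uniform clothing blob in the centre.
image = np.zeros((8, 8, 3), dtype=np.uint8)
image[2:6, 2:6] = [200, 30, 30]            # red "shirt" pixels
mask = np.zeros((8, 8), dtype=bool)
mask[2:6, 2:6] = True

uv_map = warp_to_uv(extract_clothes_region(image, mask), mask)
print(uv_map.shape)  # (16, 16, 3)
```

The resulting texture would then be applied to the deformed clothes mesh inside the game engine.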

Method for 3D Visualization of Sound Data (사운드 데이터의 3D 시각화 방법)

  • Ko, Jae-Hyuk
    • Journal of Digital Convergence, v.14 no.7, pp.331-337, 2016
  • The purpose of this study is to provide a method for visualizing sound data as a three-dimensional image. The visualization is performed according to a fixed algorithm, after producing a text-based script that defines the channel ranges of the sound data. The algorithm consists of five steps: setting the sound channel range, setting the picture frame for sound visualization, setting the properties of the 3D image units, extracting the channel range of the sound data, and performing the sound visualization; the 3D visualization then runs with minimal operation signals from an input device such as a mouse. For sound files too large for an animator to finish in the conventional way, the proposed method proved to be a low-cost, highly efficient way to produce creative artistic images, as shown by comparing the animator's working time under the conventional workflow with that under the proposed method. Future research will address real-time visualization of sound data by moving the rendering process into a game engine.
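The mapping at the heart of the five-step algorithm (a signal split into channel ranges, each driving the property of a 3D image unit) can be sketched as follows. This is a hedged illustration of the general idea only; the paper's actual scripting format and unit properties are not specified in the abstract, so the grid layout, frame count, and peak-amplitude mapping here are assumptions.

```python
import numpy as np

def sound_to_3d_units(samples, grid=(8, 8)):
    """Toy version of the abstract's pipeline: split the signal into as
    many frames as there are grid cells, take each frame's peak
    amplitude, and emit one (x, y, height) unit per cell."""
    rows, cols = grid
    frames = np.array_split(samples, rows * cols)
    heights = np.array([np.abs(f).max() for f in frames])
    xs, ys = np.meshgrid(np.arange(cols), np.arange(rows))
    return np.stack([xs.ravel(), ys.ravel(), heights], axis=1)

# One second of a 440 Hz tone with a rising volume envelope.
rate = 8000
t = np.linspace(0, 1, rate, endpoint=False)
signal = np.sin(2 * np.pi * 440 * t) * np.linspace(0, 1, rate)

units = sound_to_3d_units(signal)
print(units.shape)  # (64, 3)
```

Each row of `units` could then be instanced as a bar or particle in a 3D scene; the rising envelope shows up as heights that grow across the grid.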

The Study on Efficiency Analysis of 3D Animation Production Process Using Unreal Live Link for Autodesk Maya (언리얼 라이브 링크를 이용한 3D애니메이션 제작 공정의 효율성 분석 연구)

  • Chongsan Kwon;Si-min Kim
    • Journal of Industrial Convergence, v.21 no.9, pp.11-21, 2023
  • Many studies have sought to improve the efficiency of the CG production process, but two problems have been hard to overcome: results cannot easily be checked in the middle of work, and rendering takes a great deal of time. However, as the use of Unreal Live Link, which allows results to be checked in real time, becomes more practical, expectations for improving production efficiency are rising. This study analyzed the efficiency of a 3D animation production process using Unreal Live Link. To this end, modeling, rigging, animation, and layout work were done in Maya, and the final output image sequence was rendered in Unreal Engine through Unreal Live Link. This pipeline was then compared with the existing pipeline, in which the final image sequence is rendered inside the 3D software itself. The analysis showed that, unlike the traditional 3D animation pipeline, the final result could be checked in real time on a high-quality viewport screen during work, and that work efficiency was maximized by deriving the final output through real-time screen capture. The use of game engines in the 3D animation and film industries is steadily increasing, and work efficiency is expected to be maximized if Unreal Live Link is adopted.

A study on the effect of introducing EBS AR production system on content (EBS AR 실감영상 제작 시스템 도입이 콘텐츠에 끼친 영향에 대한 연구)

  • Kim, Ho-sik;Kwon, Soon-chul;Lee, Seung-hyun
    • The Journal of the Convergence on Culture Technology, v.7 no.4, pp.711-719, 2021
  • EBS has produced numerous educational contents with traditional virtual-studio production systems since the early 2000s and, twenty years later, adopted an AR video production system in October 2020. Although the basic concept, synthesizing graphic elements with live footage in real time by tracking camera movement and lens information, is similar to the previous system, the newly applied AR production system incorporates several improved technologies. A marker-tracking technology that allows free camera movement and stable position tracking has been applied, and the operating software runs on Unreal Engine, one of the representative graphics engines used in computer game production; this reduces the system's rendering burden and enables high-quality, real-time graphic effects. The system is installed on a crane camera used mainly for crane shots in the live broadcasting studio and is applied to live children's programs; some visuals, such as program introductions and quiz events that used to be expressed in 2D graphics, have been converted to enhanced 3D AR videos. This paper covers the effect of introducing the AR video production system on EBS content production, along with its future development direction and possibilities.