• Title/Summary/Keyword: 360 degree video

Search Results: 75

An Efficient Feature Point Extraction Method for 360˚ Realistic Media Utilizing High Resolution Characteristics

  • Won, Yu-Hyeon;Kim, Jin-Sung;Park, Byuong-Chan;Kim, Young-Mo;Kim, Seok-Yoon
    • Journal of the Korea Society of Computer and Information / v.24 no.1 / pp.85-92 / 2019
  • In this paper, we propose an efficient feature point extraction method that introduces a preprocessing step to solve the performance degradation that occurs when extracting feature points from 360-degree realistic media, exploiting the characteristics of such media. 360-degree realistic media is composed of images produced by two or more cameras, and these images are combined by extracting feature points at the edges of each image and merging images that cover the same area into one. In this production process, however, the stitching step where the images are combined into one piece can introduce seam distortion. Since 4K-class realistic media has a much higher resolution than ordinary video, the feature point extraction and matching process also takes much longer than in ordinary media.
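
The preprocessing idea described above can be sketched in a few lines. This is a hypothetical illustration, not the paper's actual extractor: the frame is downscaled before detection so far fewer pixels are examined, and the detected coordinates are then mapped back to full resolution. The toy gradient-based detector stands in for a real feature extractor.

```python
# Hedged sketch: detect features on a reduced copy of a high-resolution frame,
# then rescale the coordinates. The detector is a toy gradient test, an
# assumption for illustration only.

def downsample(img, factor=2):
    """Average factor x factor blocks of a 2-D grayscale image (list of lists)."""
    h, w = len(img), len(img[0])
    out = []
    for y in range(0, h - h % factor, factor):
        row = []
        for x in range(0, w - w % factor, factor):
            block = [img[y + dy][x + dx] for dy in range(factor) for dx in range(factor)]
            row.append(sum(block) / (factor * factor))
        out.append(row)
    return out

def toy_features(img, threshold=50):
    """Return (y, x) points whose local gradient magnitude exceeds the threshold."""
    pts = []
    for y in range(1, len(img) - 1):
        for x in range(1, len(img[0]) - 1):
            gy = abs(img[y + 1][x] - img[y - 1][x])
            gx = abs(img[y][x + 1] - img[y][x - 1])
            if gx + gy > threshold:
                pts.append((y, x))
    return pts

def features_with_preprocessing(img, factor=2, threshold=50):
    """Detect on the reduced image, then map coordinates back to the full frame."""
    small = downsample(img, factor)
    return [(y * factor, x * factor) for (y, x) in toy_features(small, threshold)]
```

With a downscale factor of 2, the detector visits roughly a quarter of the pixels, which is the kind of saving that matters at 4K and above.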

A Feature Point Recognition Ratio Improvement Method for Immersive Contents Using Deep Learning (딥 러닝을 이용한 실감형 콘텐츠 특징점 인식률 향상 방법)

  • Park, Byeongchan;Jang, Seyoung;Yoo, Injae;Lee, Jaechung;Kim, Seok-Yoon;Kim, Youngmo
    • Journal of IKEEE / v.24 no.2 / pp.419-425 / 2020
  • The market size of immersive 360-degree video content, noted as one of the key technologies of the fourth industrial revolution, increases every year. However, since most videos are distributed through illegal distribution networks such as torrents after DRM is removed, the damage caused by illegal copying is also increasing. Although filtering technology is used to respond to these issues for 2D videos, most existing filtering technologies face technical limitations, such as the huge feature-point data volume and the processing capacity required by ultra-high resolutions of 4K UHD or higher, when applied to immersive 360° videos. To solve these problems, this paper proposes a feature-point recognition ratio improvement method for immersive 360-degree videos using deep learning technology.

An Atlas Generation Method with Tiny Blocks Removal for Efficient 3DoF+ Video Coding (효율적인 3DoF+ 비디오 부호화를 위한 작은 블록 제거를 통한 아틀라스 생성 기법)

  • Lim, Sung-Gyun;Kim, Hyun-Ho;Kim, Jae-Gon
    • Journal of Broadcast Engineering / v.25 no.5 / pp.665-671 / 2020
  • MPEG-I is actively working on standardizing the coding of immersive video, which provides up to six degrees of freedom (6DoF) of viewpoint. 3DoF+ video, which adds motion parallax to the omnidirectional view of 360 video, renders a view at any desired viewpoint using multiple view videos acquired in a limited 3D space that covers upper-body motion at a fixed position. The MPEG-I Visual group is developing a test model called TMIV (Test Model for Immersive Video) in the course of developing the 3DoF+ video coding standard. In the TMIV, the redundancy between the input view videos is removed, and several atlases are generated and coded by packing patches containing the remaining texture and depth regions into frames as compactly as possible. This paper presents an atlas generation method that removes small blocks in the atlas for more efficient 3DoF+ video coding. The proposed method achieves BD-rate bit savings of 0.7% and 1.4% for natural and graphic sequences, respectively, compared to TMIV.
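
The tiny-block removal step can be illustrated with a minimal sketch. This is not the TMIV implementation; it assumes the patch layout is given as a binary occupancy mask and simply zeroes out 4-connected regions below a size threshold, so isolated tiny blocks stop consuming atlas area.

```python
# Illustrative sketch (assumption: patches are represented as a 0/1 occupancy
# mask). Connected regions of 1s smaller than min_size are erased before the
# patches are packed into the atlas.

def remove_tiny_blocks(mask, min_size):
    """Zero out 4-connected regions of 1s smaller than min_size, in place."""
    h, w = len(mask), len(mask[0])
    seen = [[False] * w for _ in range(h)]
    for sy in range(h):
        for sx in range(w):
            if mask[sy][sx] and not seen[sy][sx]:
                # flood-fill one connected component
                stack, comp = [(sy, sx)], []
                seen[sy][sx] = True
                while stack:
                    y, x = stack.pop()
                    comp.append((y, x))
                    for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                        if 0 <= ny < h and 0 <= nx < w and mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                if len(comp) < min_size:
                    for y, x in comp:
                        mask[y][x] = 0
    return mask
```

Dropping sub-threshold components trades a little texture coverage for a tighter packing, which is consistent with the small BD-rate gains reported above.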

Implementing 360-degree VR Video Streaming System Prototype for Large-scale Immersive Displays (대형 가상현실 공연장을 위한 360 도 비디오 스트리밍 시스템 프로토타입 구현)

  • Ryu, Yeongil;Choi, YiHyun;Ryu, Eun-Seok
    • Proceedings of the Korean Society of Broadcast Engineers Conference / 2022.06a / pp.1241-1244 / 2022
  • With ontact media streaming services that combine immersive media with performing-arts content such as K-Pop recently drawing attention, this paper moves away from the commonly used 2D display or HMD (Head-Mounted Display) based VR (Virtual Reality) services and proposes a 360-degree VR video streaming system for large-scale virtual reality venues. The final goal of the proposed system, to be reached through research phases 1, 2, and 3, is a 360-degree VR video streaming system that supports 6DoF (Degrees of Freedom) viewpoints; Phase 1, a prototype 3DoF 360-degree VR video streaming system for large-scale virtual reality venues, has been completed. The implemented prototype applies sub-picture based viewport-dependent streaming and, compared with the conventional approach, achieved about an 80% bit-rate reduction and about a 543% improvement in video decoding speed. Beyond implementation and performance evaluation, a trial broadcast was carried out at AlloSphere, a large-scale virtual reality venue located at UCSB in the United States, laying the groundwork for the Phase 2 and 3 research stages.
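
The core of viewport-dependent streaming is deciding which sub-pictures to fetch at high quality. The sketch below is a simplified assumption, not the paper's system: it treats the frame as equal-width tile columns indexed by yaw and selects the tiles overlapping the viewer's horizontal field of view, handling the wraparound at the 0/360 seam.

```python
# Hedged sketch of sub-picture (tile) selection for viewport-dependent
# streaming. Assumption: an equirectangular frame split into num_tiles
# equal-width columns by yaw; only overlapping tiles are fetched in high
# quality.

def tiles_in_viewport(num_tiles, yaw_deg, hfov_deg):
    """Return indices of tile columns overlapping [yaw - hfov/2, yaw + hfov/2),
    with wraparound at 360 degrees."""
    tile_width = 360.0 / num_tiles
    lo = (yaw_deg - hfov_deg / 2.0) % 360.0
    hi = (yaw_deg + hfov_deg / 2.0) % 360.0
    selected = []
    for i in range(num_tiles):
        t_lo, t_hi = i * tile_width, (i + 1) * tile_width
        if lo < hi:
            overlap = t_lo < hi and t_hi > lo
        else:  # viewport crosses the 0/360 seam
            overlap = t_lo < hi or t_hi > lo
        if overlap:
            selected.append(i)
    return selected
```

Fetching only the selected columns (plus a low-quality fallback for the rest) is the mechanism behind the kind of bit-rate and decoding-speed savings the abstract reports.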

A Method of Patch Merging for Atlas Construction in 3DoF+ Video Coding

  • Im, Sung-Gyune;Kim, Hyun-Ho;Lee, Gwangsoon;Kim, Jae-Gon
    • Proceedings of the Korean Society of Broadcast Engineers Conference / 2019.11a / pp.259-260 / 2019
  • The MPEG-I Visual group is actively working on enhancing immersive experiences with up to six degrees of freedom (6DoF). In the virtual space of 3DoF+, defined as an extension of 360 video with limited changes of the view position from a sitting posture, looking at the scene from another viewpoint (another position in space) requires rendering additional viewpoints using multiple videos captured simultaneously at different locations. The MPEG-I Visual workgroup is studying methods for the efficient coding and transmission of 3DoF+ video and recently released the Test Model for Immersive Video (TMIV). This paper presents an enhanced clustering method that packs patches into the atlas more efficiently in TMIV. The experimental results show that the proposed method achieves significant BD-rate reduction in terms of various end-to-end evaluation methods.

A Study for Virtual Reality 360 Video Production Workflow with HDR Using log Shooting (log 촬영과 HDR을 이용한 실사 360 영상 제작흐름 연구)

  • Kim, Chulhyun
    • Journal of Broadcast Engineering / v.23 no.1 / pp.63-73 / 2018
  • These days, VR content is created in three ways: CG-based, game-engine-based, and live-action shooting; the most common is live-action shooting. So far, most live action has been shot with action cams, which differs from the professional image production methods used for movies and TV dramas. This study points out the differences between professional image production and action-cam-based shooting and proposes an alternative: log-shooting-based HDR filming and editing. In test shooting and editing, the proposed method obtained more color information than conventional action-cam-based shooting and could thereby produce high-definition images that are difficult to achieve with an action cam.
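
Why log shooting preserves more color information can be illustrated with a generic logarithmic transfer curve. This is an assumption for illustration, not any vendor's actual log curve: the logarithm compresses scene-linear highlights into the limited code-value range, so more stops of dynamic range survive quantization than with a straight linear mapping.

```python
import math

# Hedged illustration of a log transfer curve (hypothetical parameters, not a
# real camera profile): scene-linear light in [black, white] is mapped to a
# [0, 1] code value logarithmically, allocating code values per stop rather
# than per unit of light.

def log_encode(linear, black=0.01, white=16.0):
    """Map scene-linear light in [black, white] to a [0, 1] code value."""
    linear = max(black, min(white, linear))
    return math.log(linear / black) / math.log(white / black)

def log_decode(code, black=0.01, white=16.0):
    """Inverse of log_encode: recover scene-linear light from a code value."""
    return black * (white / black) ** code
```

In a grading workflow, footage is decoded back to scene-linear values before HDR tone mapping, which is roughly the editing path the abstract describes.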

A Study on Immersive 360-degree Video Application Metadata and Operating System for Interworking with UCI Standard Identification System (UCI 표준식별체계 연동을 위한 실감형 360도 영상 응용 메타데이터 및 운영 시스템에 관한 연구)

  • Park, Byeongchan;Jang, Seyoung;Ruziev, Ulugbek;Kim, Youngmo;Kim, Seok-Yoon
    • Proceedings of the Korean Society of Computer Information Conference / 2020.07a / pp.433-435 / 2020
  • This paper proposes application metadata elements for UCI operation, using robustness information of immersive 360-degree video, so that copyright protection technology can be applied. As the multimedia content industry grows rapidly, a content identification system that can manage and distribute content effectively is required. The representative identifier currently in operation in Korea is UCI, developed by the Ministry of Information and Communication. UCI can effectively manage and distribute various multimedia content, but it is not directly linked to copyright protection technology, so a complementary technology is needed. This paper proposes immersive 360-degree video application metadata elements and an operation method that can interwork directly with UCI, enabling copyright protection technology to be applied.

A Study on effective directive technique of 3D animation in Virtual Reality -Focus on Interactive short using 3D Animation making of Unreal Engine- (가상현실에서 효과적인 3차원 영상 연출을 위한 연구 -언리얼 엔진의 영상 제작을 이용한 인터렉티브 쇼트 중심으로-)

  • Lee, Jun-soo
    • Cartoon and Animation Studies / s.47 / pp.1-29 / 2017
  • 360-degree virtual reality has been available for a long time and has recently been actively promoted worldwide thanks to the development of devices such as the HMD (Head Mounted Display) and of the hardware for controlling and playing virtual reality video. Producing 360-degree VR requires a different mode of production from traditional video, and new considerations for the user have emerged. Since virtual reality video targets a platform that demands immersion, presence, and interaction, it needs a suitable cinematography. In VR, users can freely explore the world created by the director and concentrate on whatever interests them while the video plays; the director, however, must devise devices that keep the viewer focused on the narrative progression and the images to be delivered. Among the various means of conveying images, the director can use the composition of the shot. This paper studies how to apply directing techniques based on shot composition effectively to 360-degree virtual reality. There is still no dominant killer content for virtual reality, in Korea or abroad; even so, the potential of virtual reality is recognized and a variety of videos are being produced. Production therefore follows traditional methods, including traditional shot composition. In 360-degree virtual reality, however, long takes and the blocking techniques of the conventional third-person viewpoint dominate production, and the limits of shot composition become apparent.
While the viewer can interactively look around the 360-degree scene using HMD tracking, the composition of shots and the connections between them remain entirely dependent on the director, as in conventional cinematography. This study examines whether the viewer can freely change the cinematography, such as the shot composition, at a time of the viewer's choosing, using the interactive nature of VR video. To do this, 3D animation was created with the Unreal Engine game tool to construct an interactive video. Using Unreal Engine's visual scripting system, called Blueprint, a device was built that distinguishes the true and false branches of a condition with a trigger node, producing a variety of shots. Through this, various directing techniques can be developed, related research is expected, and the work should help advance 360-degree VR video.

Fast Pattern Tracking in Cubemap Video Using Kalman Filter (큐브맵 비디오에서 칼만 필터를 사용한 빠른 패턴 추적)

  • Kim, Ki-Sik;Park, Jong-Seung
    • Journal of Korea Game Society / v.20 no.6 / pp.43-52 / 2020
  • This paper presents a fast pattern tracking method using location prediction in cubemap video for 360-degree VR. A spherical cubemap frame has six face textures, so searching for a pattern is much slower than in a flat image. To overcome this limitation, we propose a method that predicts the location of the target pattern using a Kalman filter and reduces the search area by considering only the textures at the predicted location. The experimental results show that the proposed system is much faster than the previous method of searching all six faces while still providing accurate pattern tracking.
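
The search-area reduction can be sketched in a few lines. As a simplifying assumption, a plain constant-velocity extrapolation stands in for the paper's Kalman filter (which would also maintain an error covariance): the target's next 3-D direction is predicted, and only the cubemap face that the predicted direction falls on is searched instead of all six.

```python
# Hedged sketch: constant-velocity prediction (a stand-in for a full Kalman
# filter) plus dominant-axis face lookup, so the pattern search touches one
# cubemap face instead of six.

def predict(prev, curr):
    """Constant-velocity prediction of the next 3-D direction vector."""
    return tuple(c + (c - p) for p, c in zip(prev, curr))

def cubemap_face(direction):
    """Map a 3-D direction to one of six face names by its dominant axis."""
    x, y, z = direction
    ax, ay, az = abs(x), abs(y), abs(z)
    if ax >= ay and ax >= az:
        return "+x" if x >= 0 else "-x"
    if ay >= ax and ay >= az:
        return "+y" if y >= 0 else "-y"
    return "+z" if z >= 0 else "-z"

def faces_to_search(prev, curr):
    """Restrict the pattern search to the face of the predicted location."""
    return [cubemap_face(predict(prev, curr))]
```

A production tracker would also search neighboring faces when the prediction lands near a cube edge, but even then the search stays well below six full faces.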

A Study of Direction of VR Animation <Goodbye Mr Octopus> (VR애니메이션 <Goodbye Mr Octopus> 연출 연구)

  • Lee, TaeGu;Park, Sukyung
    • The Journal of the Convergence on Culture Technology / v.9 no.1 / pp.135-140 / 2023
  • VR animation lets the viewer stand inside the space of the animation and see 360-degree staging that cannot be achieved in conventional animation production. <Goodbye Mr Octopus>, a VR animation produced in 2020, was selected as an immersive short film at the 77th Venice Film Festival. It tells the story of Stella, an adolescent girl celebrating her 16th birthday, whose conflict with her strict father is resolved through a letter from her mother. The narrative consists of 11 scenes, and in each scene the new directing elements of VR video grammar, such as gaze guidance, the flow of time, and spatial transitions, were analyzed. The gaze-guiding direction minimized the inconvenience of 360-degree viewing, and the time and space transitions were found to increase the audience's immersion in the narrative events.