• Title/Summary/Keyword: 360 Degree Video


Real-Time Panoramic Video Streaming Technique with Multiple Virtual Cameras (다중 가상 카메라의 실시간 파노라마 비디오 스트리밍 기법)

  • Ok, Sooyol; Lee, Suk-Hwan
    • Journal of Korea Multimedia Society, v.24 no.4, pp.538-549, 2021
  • In this paper, we introduce a technique for real-time 360-degree panoramic video streaming with multiple virtual cameras. The proposed technique consists of generating 360-degree panoramic video data via ORB feature point detection, texture transformation, panoramic video data compression, and RTSP-based video streaming transmission. In particular, the generation of the 360-degree panoramic video data and the texture transformation are accelerated with CUDA to handle computationally complex processing such as camera calibration, stitching, blending, and encoding. Our experiment evaluated the frames per second (fps) of the transmitted 360-degree panoramic video. Experimental results verified that our technique achieves at least 30 fps at 4K output resolution, which indicates that it can both generate and transmit 360-degree panoramic video data in real time.
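The blending stage of a stitching pipeline like the one above can be illustrated with linear feather blending across the overlap between two adjacent image strips. This is a minimal pure-Python sketch under assumed inputs (grayscale images as lists of rows); the function name `feather_blend` is hypothetical, and the paper's CUDA-accelerated implementation is not shown here.

```python
def feather_blend(left, right, overlap):
    """Blend two horizontally adjacent grayscale strips whose last/first
    `overlap` columns cover the same scene area (illustrative sketch)."""
    out = []
    for y in range(len(left)):
        row = list(left[y][:-overlap])            # left-only region
        for x in range(overlap):                  # blended seam
            alpha = x / (overlap - 1) if overlap > 1 else 1.0
            l = left[y][len(left[y]) - overlap + x]
            r = right[y][x]
            row.append((1 - alpha) * l + alpha * r)
        row.extend(right[y][overlap:])            # right-only region
        out.append(row)
    return out
```

The blend weight ramps linearly from the left image to the right image across the seam, which hides small exposure differences between neighboring cameras.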

Improve Compression Efficiency of 360-degree VR Video by Correcting Perspective in Cubemap Projection (Cubemap Projection 360도 VR 비디오에서 시점 보정을 통한 압축 효율 향상 방법)

  • Yoon, Sung Jea; Park, Gwang Hoon
    • Journal of Broadcast Engineering, v.22 no.1, pp.136-139, 2017
  • Recently, many companies and consumers have shown great interest in VR (Virtual Reality), so many VR devices such as HMDs (Head Mounted Displays) and 360-degree VR cameras have been released on the market. Currently, 360-degree VR video is encoded with codecs originally designed for conventional 2D video. The compression efficiency is therefore not optimized, because the encoder/decoder does not consider the characteristics of 360-degree VR video. In this paper, we propose a method to improve the compression efficiency by using a reference frame that compensates for the distortions caused by the characteristics of 360-degree VR video. Applying the proposed method, we were able to increase the compression efficiency by providing better prediction.
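For readers unfamiliar with cubemap projection, the sketch below (a hypothetical helper, not the authors' code) shows the core of the mapping: each 3D viewing direction lands on one of six cube faces by the dominant-axis rule, and each face is then stored as an ordinary perspective image, which is where the face-boundary distortions addressed by the paper come from.

```python
def cube_face(vx, vy, vz):
    """Return which cube face the viewing direction (vx, vy, vz) hits,
    using the dominant-axis rule of cubemap projection."""
    ax, ay, az = abs(vx), abs(vy), abs(vz)
    if ax >= ay and ax >= az:        # x-axis dominates
        return "right" if vx > 0 else "left"
    if ay >= ax and ay >= az:        # y-axis dominates
        return "top" if vy > 0 else "bottom"
    return "front" if vz > 0 else "back"
```

Because each face is a separate perspective image, straight lines bend and motion fields break where a moving object crosses from one face to another; this is the discontinuity the paper's perspective correction targets.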

Activated Viewport based Surveillance Event Detection in 360-degree Video (360도 영상 공간에서 활성 뷰포트 기반 이벤트 검출)

  • Shim, Yoo-jeong; Lee, Myeong-jin
    • Journal of Broadcast Engineering, v.25 no.5, pp.770-775, 2020
  • Since the 360-degree ERP (equirectangular projection) frame structure has location-dependent distortion, existing video surveillance algorithms cannot be applied to 360-degree video directly. In this paper, an activated-viewport-based event detection method is proposed for 360-degree video. After extracting activated viewports enclosing object candidates, objects are detected within those viewports. The detected objects are then tracked in 360-degree video space for region-based event detection. The proposed method is shown to improve the recall and the false negative rate by more than 30% compared to the conventional method without activated viewports.
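The activated-viewport idea relies on mapping between a distortion-free rectilinear viewport and ERP frame coordinates. A minimal sketch of that mapping follows; the function name and axis conventions are assumptions of this sketch, not the paper's code.

```python
import math

def viewport_to_erp(i, j, vp_w, vp_h, fov_deg, yaw_deg, pitch_deg, erp_w, erp_h):
    """Map viewport pixel (i, j) to ERP coordinates (u, v).
    Axis convention (assumed here): z forward, x right, y down;
    yaw rotates around y, pitch around x."""
    f = (vp_w / 2) / math.tan(math.radians(fov_deg) / 2)  # focal length in pixels
    vx = i - vp_w / 2 + 0.5                               # tangent-plane ray
    vy = j - vp_h / 2 + 0.5
    vz = f
    p, q = math.radians(pitch_deg), math.radians(yaw_deg)
    vy, vz = vy * math.cos(p) - vz * math.sin(p), vy * math.sin(p) + vz * math.cos(p)
    vx, vz = vx * math.cos(q) + vz * math.sin(q), -vx * math.sin(q) + vz * math.cos(q)
    lon = math.atan2(vx, vz)                              # longitude in [-pi, pi]
    lat = math.asin(vy / math.sqrt(vx * vx + vy * vy + vz * vz))
    u = (lon / (2 * math.pi) + 0.5) * erp_w
    v = (lat / math.pi + 0.5) * erp_h
    return u, v
```

Sampling a viewport through this mapping removes the latitude-dependent stretching of ERP, which is why object detectors behave normally inside the extracted viewports.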

Quality of Experience Experiment Method and Statistical Analysis for 360-degree Video with Sensory Effect

  • Jin, Hoe-Yong; Kim, Sang-Kyun
    • Journal of Broadcast Engineering, v.25 no.7, pp.1063-1072, 2020
  • This paper proposes an experimental method for measuring the quality of experience, to quantify the influence of sensory effects applied to 360-degree video on participants' immersion, satisfaction, and presence. Participants watch 360-degree videos on an HMD while receiving sensory effects from scent-diffusing and wind devices. Subsequently, a questionnaire is conducted on the degree of immersion, satisfaction, and presence for the video they watched. By analyzing the correlation of the survey results, we found that providing sensory effects increases satisfaction with 360-degree video viewing and that the experimental method is appropriate. In addition, the results indicated that the P.910 method is not suitable for measuring the immersion and presence of 360-degree video under the provision of sensory effects.
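The correlation analysis of questionnaire scores mentioned above can be carried out with an ordinary Pearson coefficient. The stdlib-only sketch below is illustrative of that kind of analysis, not the authors' actual statistical pipeline.

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length score lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)
```

A strongly positive coefficient between, for example, hypothetical sensory-effect ratings and satisfaction ratings would be the kind of evidence the abstract describes.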

Performance Analysis of Viewport-dependent Tiled Streaming on 16K Ultra High-quality 360-degree Video (16K 초고화질 360도 영상에서의 사용자 시점 기반 타일 스트리밍 성능 검증)

  • Jeong, Jong-Beom; Lee, Soonbin; Kim, Inae; Ryu, Eun-Seok
    • Journal of Internet Computing and Services, v.22 no.3, pp.1-8, 2021
  • Ultra-high-quality, ultra-high-resolution omnidirectional 360-degree video streaming is needed to provide immersive media through a head-mounted display (HMD) in virtual reality environments, but it requires high bandwidth and computational complexity. One approach to avoiding these problems is viewport-dependent selective streaming using a tile-based segmentation method. This paper presents a performance analysis of viewport-dependent tiled streaming on 16K ultra-high-quality 360-degree videos and on the widely used 4K 360-degree videos. Experimental results showed a 42.47% Bjøntegaard delta rate (BD-rate) saving for 16K ultra-high-quality 360-degree video tiled streaming compared to viewport-independent streaming, while 4K 360-degree video showed a 26.41% BD-rate saving. Therefore, this paper verifies that tiled streaming is more efficient on ultra-high-quality video.
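The bandwidth mechanics behind viewport-dependent tiled streaming can be sketched on an ERP frame split into equal tile columns. The tiling, names, and bitrate numbers below are illustrative assumptions, not the paper's experimental setup.

```python
import math

def viewport_tile_columns(cols, yaw_deg, h_fov_deg):
    """ERP frame split into `cols` equal tile columns over 360 degrees;
    return the set of column indices overlapped by the horizontal FOV."""
    tile_width = 360.0 / cols
    lo = yaw_deg - h_fov_deg / 2
    hi = yaw_deg + h_fov_deg / 2
    first = math.floor(lo / tile_width)
    last = math.ceil(hi / tile_width) - 1
    return {c % cols for c in range(first, last + 1)}  # wrap around 360°

def bitrate_saving_pct(cols, n_viewport, hq_bps, lq_bps):
    """Saving of selective streaming (HQ viewport tiles, LQ elsewhere)
    versus sending every tile in high quality."""
    selective = n_viewport * hq_bps + (cols - n_viewport) * lq_bps
    return 100.0 * (1 - selective / (cols * hq_bps))
```

With 8 tile columns and a 90-degree horizontal FOV, only 2 columns need high quality; the saving grows with resolution because the absolute bitrate gap between HQ and LQ tiles widens, matching the paper's finding that tiling pays off most at 16K.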

Experiment Method for Measuring Quality of Experience for 360-degree Video with Sensorial Effects (360도 동영상 감각효과에 대한 사용자경험품질 측정 실험 방법)

  • Jeong, Min Hyuk; Kim, Sang Kyun
    • Journal of Broadcast Engineering, v.25 no.1, pp.113-116, 2020
  • This paper proposes a Quality of Experience (QoE) evaluation experiment to measure the effects of 360-degree video and sensory effects on the subject's degree of immersion, satisfaction, and sense of presence. The test subject responds to a questionnaire about immersion, satisfaction, and presence after experiencing a 360-degree video accompanied by sensory effects while wearing a head-mounted display (HMD) equipped with a scent diffusion device. The response analysis confirmed that the proposed experimental method is suitable for measuring the subject's immersion, satisfaction, and sense of presence with respect to 360-degree video and sensory effects. On the other hand, inserting a gray screen for comparison purposes while watching a 360-degree video was found to cause a significant decrease in immersion and presence.

A study on lighting angle for improvement of 360 degree video quality in metaverse (메타버스에서 360° 영상 품질향상을 위한 조명기 투사각연구)

  • Kim, Joon Ho; An, Kyong Sok; Choi, Seong Jhin
    • The Journal of the Convergence on Culture Technology, v.8 no.1, pp.499-505, 2022
  • Recently, the metaverse has been receiving a lot of attention. The metaverse is a virtual space in which various events can be held, and 360-degree video, a format optimized for metaverse spaces, is attracting particular attention. A 360-degree video is created by stitching together images taken with multiple cameras or lenses covering all directions. Because the camera captures everything around it, the shooting crew and equipment in front of the camera also appear in the footage, so when shooting a 360-degree video, everything except the subject must be hidden from the camera's surroundings. This shooting method raises several problems, the biggest of which is lighting: unlike conventional shooting, it is very difficult to install a lighting fixture focused on the subject from behind the camera. This study is an experimental study to find the optimal lighting angle for 360-degree video by adjusting the angle of indoor lighting, and we propose a method to record 360-degree video without installing additional lighting. Based on these results, we expect future experiments with a wider variety of projection angles, which should be helpful when using 360-degree video in metaverse spaces.

A Study on Projection Conversion for Efficient 3DoF+ 360-Degree Video Streaming

  • Jeong, Jong-Beom; Lee, Soonbin; Jang, Dongmin; Kim, Sungbin; Lee, Sangsoon; Ryu, Eun-Seok
    • Journal of Broadcast Engineering, v.24 no.7, pp.1209-1220, 2019
  • The demand for virtual reality (VR) is rapidly increasing. Providing an immersive experience requires substantial computation and large amounts of data to transmit; for example, a 360-degree video (360 video) with at least 4K resolution is needed to offer an immersive experience to users. Moreover, the MPEG-I group defined three degrees of freedom plus (3DoF+), which requires the transmission of multiview 360 videos simultaneously. This can be a burden for a VR streaming system. Accordingly, in this work, a bitrate-saving method using projection conversion is introduced, along with experimental results for streaming 3DoF+ 360 video. The results show that projection conversion of 360 video with 360Lib yields a Bjøntegaard delta bitrate gain of as much as 11.4%.
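Several abstracts in this listing report gains as Bjøntegaard delta rate (BD-rate). The sketch below computes the standard metric from four rate-PSNR points per codec, fitting a cubic through log-rate as a function of PSNR and averaging the gap over the overlapping PSNR range; it is the generic metric, not any of these papers' exact evaluation scripts.

```python
import math

def _solve4(A, b):
    """Solve a 4x4 linear system by Gaussian elimination with pivoting."""
    n = 4
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for c in range(n):
        piv = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[piv] = M[piv], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def bd_rate(anchor, test):
    """BD-rate (%) of `test` vs `anchor`; each is four (kbps, psnr_db) points.
    Negative values mean `test` needs less bitrate for the same quality."""
    def fit(points):  # cubic: log10(rate) as a polynomial in PSNR
        A = [[p ** k for k in range(4)] for _, p in points]
        b = [math.log10(r) for r, _ in points]
        return _solve4(A, b)
    ca, ct = fit(anchor), fit(test)
    lo = max(min(p for _, p in anchor), min(p for _, p in test))
    hi = min(max(p for _, p in anchor), max(p for _, p in test))
    def integral(c):  # definite integral of the cubic over [lo, hi]
        return sum(c[k] / (k + 1) * (hi ** (k + 1) - lo ** (k + 1)) for k in range(4))
    avg_log_diff = (integral(ct) - integral(ca)) / (hi - lo)
    return (10 ** avg_log_diff - 1) * 100.0
```

For instance, a codec that halves the bitrate at every quality level yields a BD-rate of -50%, so the paper's 11.4% delta bitrate gain corresponds to a substantially flatter rate-distortion curve after projection conversion.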

Implementing VVC Tile Extractor for 360-degree Video Streaming Using Motion-Constrained Tile Set

  • Jeong, Jong-Beom; Lee, Soonbin; Kim, Inae; Lee, Sangsoon; Ryu, Eun-Seok
    • Journal of Broadcast Engineering, v.25 no.7, pp.1073-1080, 2020
  • 360-degree video streaming technologies have been widely developed to provide immersive virtual reality (VR) experiences. However, high computational power and bandwidth are required to transmit and render high-quality 360-degree video through a head-mounted display (HMD). One way to overcome this problem is to transmit only the viewport area in high quality. This paper therefore proposes a motion-constrained tile set (MCTS)-based tile extractor for versatile video coding (VVC). The proposed extractor extracts high-quality viewport tiles, which are simulcast with a low-quality version of the whole video to respond to unexpected movements by the user. The experimental results demonstrate a 24.81% Bjøntegaard delta rate (BD-rate) saving for the luma peak signal-to-noise ratio (PSNR) compared to a VVC anchor without tiled streaming.
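The extract-and-simulcast strategy described above can be summarized in a toy model: high-quality viewport tiles are sent alongside an always-available low-quality whole frame, which covers sudden head turns. Function names and numbers here are illustrative assumptions, not the paper's extractor.

```python
def total_bitrate(viewport_tiles, hq_tile_bps, lq_full_bps):
    """HQ viewport tiles plus the always-sent LQ whole-frame fallback."""
    return len(viewport_tiles) * hq_tile_bps + lq_full_bps

def render_source(tile, viewport_tiles):
    """After an unexpected head turn, tiles outside the HQ set are
    rendered from the LQ fallback until fresh HQ tiles arrive."""
    return "HQ" if tile in viewport_tiles else "LQ-fallback"
```

The LQ simulcast costs extra bandwidth, but it guarantees that something is always renderable in every direction, which is why the scheme still achieves a net BD-rate saving over sending the whole frame in high quality.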

The Implementation of Information Providing Method System for Indoor Area by using the Immersive Media's Video Information (실감미디어 동영상정보를 이용한 실내 공간 정보 제공 시스템 구현)

  • Lee, Sangyoon; Ahn, Heuihak
    • Journal of Korea Society of Digital Industry and Information Management, v.12 no.3, pp.157-166, 2016
  • This paper presents an indoor spatial information system using 6D-360-degree immersive media video information. We implement augmented reality that provides various kinds of information, such as position and movement information, for specific locations in indoor spaces where GPS signals do not reach. Augmented reality based on the 6D-360-degree immersive media video provides position information and three-dimensional spatial image information to identify the exact location of a user, both in a fixed indoor space and inside a moving object. This paper builds a three-dimensional image database from the 6D-360-degree immersive media video information and provides an augmented reality service on top of it. By mapping various information onto the 6D-360-degree immersive media video, users can inspect a plant in the same environment as the actual one. The system is suggested as an augmented reality service for emergency escape and repair guidance for passengers and employees.