• Title/Summary/Keyword: Head mounted display (HMD)

Search results: 202

A Study on Core Factors and Application of Asymmetric VR Content (Asymmetric VR 콘텐츠 제작의 핵심 요인과 활용에 관한 연구)

  • Kim, Jinmo
    • Journal of the Korea Computer Graphics Society
    • /
    • v.23 no.5
    • /
    • pp.39-49
    • /
    • 2017
  • In this study, we propose the core factors and applications of asymmetric virtual reality (VR) content, in which a head-mounted display (HMD) user and non-HMD users work together in a co-located space, leading to varied experiences and high presence. The core of the proposed asymmetric VR content is that all users are immersed in VR and participate in new experiences, reflecting a wide range of users and environments regardless of whether they wear an HMD. To this end, this study defines the role relationships between the HMD user and non-HMD users, the viewpoints provided to users, and the speech communication structure available among users. Based on this, we verified the core factors by directly producing assistive asymmetric VR content and cooperative asymmetric VR content. Finally, we conducted a survey to examine users' presence and experience of the proposed asymmetric VR content and to analyze application methods. The results confirmed that if the purpose of the asymmetric VR content and the core factors for the two types of users are clearly distinguished and defined, the independent experience presented by the VR content, together with perceived presence, can provide a satisfying experience to all users.

Auditory Spatial Arrangement of Object's Position in Virtual and Augmented Environment (가상환경에서의 위치정보 제시를 위한 청각적 공간배열)

  • Lee, Ju-Hwan
    • Journal of Advanced Navigation Technology
    • /
    • v.15 no.2
    • /
    • pp.326-333
    • /
    • 2011
  • In the present study, we measured user performance (accuracy and reaction time) in a virtual environment with a see-through head-mounted display system that includes 3D sound generated through a Head-Related Transfer Function (HRTF), to investigate the feasibility of auditory display of an object's spatial information. Summarizing the results of the two experiments: when presenting an object's location with 3D sound, it is desirable to arrange the objects orthogonally (at right angles) relative to the user rather than diagonally. These results suggest that presenting spatial information with 3D sound makes an optimal object arrangement in virtual environments possible.
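An HRTF renderer of the kind the abstract relies on is usually driven by an azimuth/elevation pair rather than by Cartesian coordinates. As a minimal illustration (my own sketch, not the paper's implementation), the conversion from an object's position relative to the listener's head to those angles looks like this:

```python
import math

def to_azimuth_elevation(dx, dy, dz):
    """Convert an object's position relative to the listener's head
    (x: right, y: up, z: forward) into the azimuth/elevation pair, in
    degrees, that an HRTF-based spatializer typically takes as input."""
    azimuth = math.degrees(math.atan2(dx, dz))       # 0 = straight ahead
    horizontal = math.hypot(dx, dz)
    elevation = math.degrees(math.atan2(dy, horizontal))  # 0 = ear level
    return azimuth, elevation
```

Under this convention, the orthogonal arrangement the study favors places objects at azimuths of 0, 90, 180, and 270 degrees, while a diagonal arrangement places them at 45, 135, 225, and 315 degrees.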

Implementation of Random Controlling of Convergence Point in VR Image Content Production (VR 영상콘텐츠 제작을 위한 컨버전스 포인트 임의조절 구현)

  • Jin, Hyung Woo;Baek, Gwang Ho;Kim, Mijin
    • Smart Media Journal
    • /
    • v.4 no.4
    • /
    • pp.111-119
    • /
    • 2015
  • As a variety of HMDs (head-mounted displays) have come onto the market, the production of 3D images grafted with VR (virtual reality) technologies has helped activate the production of tangible, immersive image content. VR-based image content has expanded its applicability across the entertainment industry, from animation and games to realistic images, and solution development for producing VR image content has likewise gained momentum. However, among the production solutions used to date, shooting with a fixed stereo camera has the limitation that the user's binocular disparity is fixed. This not only restricts the modes of expression a producer intends to direct, but may also prevent the sense of depth or space from being perceived sufficiently, since viewing conditions on the user's side are not taken into account. This study aims to resolve, with techniques applied in the latter stage of 3D image production, the problem that convergence points can be adjusted only with restrictions during the production of VR image content. In the later stage of the 3D imaging work, the significance of adjusting convergence points is analyzed through visualization of binocular disparity and applied to a game engine, making it possible to implement a function by which the user can control the convergence point at will.
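The relationship between convergence point and on-screen disparity that the abstract turns on can be sketched as follows; this is a generic shifted-sensor stereo model for illustration, with hypothetical default values, not the authors' implementation:

```python
def screen_disparity(depth, interaxial=0.065, convergence=2.0, focal=0.05):
    """Horizontal image-plane disparity of a point at `depth` metres for a
    shifted-sensor stereo rig (all distances in metres).

    Points at the convergence distance have zero disparity; nearer points
    get crossed (negative) disparity and appear in front of the screen,
    farther points get uncrossed (positive) disparity.  Moving the
    convergence point therefore re-distributes depth around the screen
    plane -- exactly the adjustment a fixed stereo camera cannot make.
    """
    return focal * interaxial * (1.0 / convergence - 1.0 / depth)
```

Exposing `convergence` as a runtime parameter in the game engine is what allows the user-controlled convergence point the study implements.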

Virtual Reality Using X3DOM (X3DOM을 이용한 가상현실)

  • Chheang, Vuthea;Ryu, Ga-Ae;Jeong, Sangkwon;Lee, Gookhwan;Yoo, Kwan-Hee
    • Journal of the Institute of Electronics and Information Engineers
    • /
    • v.54 no.1
    • /
    • pp.165-170
    • /
    • 2017
  • Web 3D technology can be used to simulate scientific, medical, and engineering experiments and multimedia visualization. In a web environment, 3D virtual reality can be accessed without strict constraints on operating system, location, or time. Virtual reality (VR) uses three-dimensional, computer-generated realistic images, sound, and other sensations to replicate a real environment or an imaginary setting that a person can explore and interact with. That person is immersed within the virtual environment and is able to manipulate objects or perform a series of actions. A virtual environment can be created with X3D, the ISO standard for defining interactive, web-based 3D content and integrating it with multimedia. In this paper, we discuss X3D VR stereo-rendered scenes and propose new X3D nodes for HMD VR (head-mounted display virtual reality). The proposed nodes are visualized with X3DOM, a framework that renders X3D in the web browser.

3D View Controlling by Using Eye Gaze Tracking in First Person Shooting Game (1 인칭 슈팅 게임에서 눈동자 시선 추적에 의한 3차원 화면 조정)

  • Lee, Eui-Chul;Cho, Yong-Joo;Park, Kang-Ryoung
    • Journal of Korea Multimedia Society
    • /
    • v.8 no.10
    • /
    • pp.1293-1305
    • /
    • 2005
  • In this paper, we propose a method of manipulating the gaze direction of a 3D FPS game character using eye gaze detection from successive images captured by a USB camera attached beneath an HMD. The proposed method consists of three parts. In the first part, we detect the user's pupil center with a real-time image-processing algorithm applied to the successive input images. In the second part, calibration, the geometric relationship is determined between positions gazed at on the monitor and the corresponding detected eye positions. In the last part, the final gaze position on the HMD monitor is tracked, and the 3D view in the game is controlled by the gaze position based on the calibration information. Experimental results show that our method can be used by handicapped game players who cannot use their hands. It can also increase interest and immersion by synchronizing the gaze direction of the game player with that of the game character.
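The calibration step described here, relating detected pupil positions to monitor coordinates, is commonly done with a per-axis least-squares fit. A minimal sketch under that assumption (the paper does not specify its exact mapping):

```python
def fit_axis(pupil, screen):
    """Least-squares fit screen = a * pupil + b for one axis, from
    calibration samples collected while the user fixates known targets."""
    n = len(pupil)
    mp = sum(pupil) / n
    ms = sum(screen) / n
    a = (sum((p - mp) * (s - ms) for p, s in zip(pupil, screen))
         / sum((p - mp) ** 2 for p in pupil))
    return a, ms - a * mp

def calibrate(pupil_pts, screen_pts):
    """Fit independent x and y mappings; return a function that maps a
    detected pupil centre (px, py) to a monitor coordinate (sx, sy)."""
    ax, bx = fit_axis([p[0] for p in pupil_pts], [s[0] for s in screen_pts])
    ay, by = fit_axis([p[1] for p in pupil_pts], [s[1] for s in screen_pts])
    return lambda p: (ax * p[0] + bx, ay * p[1] + by)
```

With four corner targets fixated in turn, the returned function gives the gaze point that then drives the 3D view in the game.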


3D First Person Shooting Game by Using Eye Gaze Tracking (눈동자 시선 추적에 의한 3차원 1인칭 슈팅 게임)

  • Lee, Eui-Chul;Park, Kang-Ryoung
    • The KIPS Transactions:PartB
    • /
    • v.12B no.4 s.100
    • /
    • pp.465-472
    • /
    • 2005
  • In this paper, we propose a method of manipulating the gaze direction of a 3D FPS game character using eye gaze detection from successive images captured by a USB camera attached beneath an HMD. The proposed method consists of three parts. First, we detect the user's pupil center with a real-time image-processing algorithm applied to the successive input images. In the second part, calibration, while the user gazes at the monitor plane, the geometric relationship between the gazed-at position on the monitor and the detected position of the pupil center is determined. In the last part, the final gaze position on the HMD monitor is tracked, and the 3D view in the game is controlled by the gaze position based on the calibration information. Experimental results show that our method can be used by handicapped game players who cannot use their hands. It can also increase interest and immersion by synchronizing the gaze direction of the game player with the view direction of the game character.

A Study on the Comparison of the Virtual Reality Development Environment in Unity and Unreal Engine 4 (유니티와 언리얼 엔진 4 에서의 가상현실 개발환경에 관한 비교연구)

  • Yunsik, Cho;Jinmo, Kim
    • Journal of the Korea Computer Graphics Society
    • /
    • v.28 no.5
    • /
    • pp.1-11
    • /
    • 2022
  • Game engines have the advantage of enabling efficient content production, such as reduced development time, while guaranteeing a minimum level of visual quality and supporting multiple platforms. Recently, game engines have provided various functions for producing immersive virtual reality (VR) HMD content easily, quickly, and effectively. This study therefore compares the development environments for VR content production using the Oculus Quest 2 HMD, focusing on the Unity and Unreal game engines, which are widely used in the content production industry, including games. First, we compare the basic setup process for building a development environment using the Oculus Quest 2 HMD and its dedicated controllers, based on the VR template project containing the minimum functions and settings each engine provides. Next, we present a simple interactive experience in a virtual environment and compare the development environment for the dedicated controllers with the process of building a development environment that uses the hands directly through Oculus Quest 2's hand-tracking function. Through this process, we aim to clarify the basic process of building a VR development environment, identify the characteristics and differences of each engine, and provide research that can be applied to the production of various immersive content.

A Study on effective directive technique of 3D animation in Virtual Reality -Focus on Interactive short using 3D Animation making of Unreal Engine- (가상현실에서 효과적인 3차원 영상 연출을 위한 연구 -언리얼 엔진의 영상 제작을 이용한 인터렉티브 쇼트 중심으로-)

  • Lee, Jun-soo
    • Cartoon and Animation Studies
    • /
    • s.47
    • /
    • pp.1-29
    • /
    • 2017
  • 360-degree virtual reality is a technology that has long been available, and in recent years it has been actively promoted worldwide thanks to the development of devices such as the HMD (head-mounted display) and of hardware for controlling and playing virtual reality images. Producing 360-degree VR requires a different mode of production from traditional video, and new considerations for the user have begun to appear. Since virtual reality imagery targets a platform that demands immersion, presence, and interaction, suitable cinematography is necessary. In VR, users can freely explore the world created by the director and concentrate on their own interests while the image plays. However, the director must develop and install devices that let the viewer concentrate on the narrative progression and the images to be delivered. Among the various methods of conveying images, the director can use shot composition, and in this paper we study how to apply directing techniques based on shot composition effectively to 360-degree virtual reality. There is as yet no dominant killer content, in Korea or abroad; in this situation, the potential of virtual reality is being recognized and various images are being produced. Production, however, still follows traditional image-making methods, and so does shot composition. In 360-degree virtual reality, the long take and the blocking techniques of the conventional third-person view are used as the main directing configurations, and the limits of shot composition are being felt.
In addition, while the viewer can interactively look around the 360-degree scene using HMD tracking, the composition of shots and the connections between them remain entirely dependent on the director, as in existing cinematography. In this study, I investigated whether the viewer can freely change the cinematography, such as the composition of a shot, at a time of the viewer's choosing, using the interactive character of VR imagery. To do this, a 3D animation was created with the Unreal Engine game tool to construct an interactive image. Using Unreal Engine's visual scripting system, Blueprint, we created a device that distinguishes the true and false outcomes of a condition with a trigger node, producing a variety of shots. Through this, various directing techniques can be developed, related research is expected to follow, and it should help the development of 360-degree VR imagery.
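The Blueprint device the abstract describes, a trigger whose true/false outcome selects the next shot, can be sketched in ordinary code. The shot names and transition table below are hypothetical illustrations, not taken from the paper:

```python
# (current shot, did the viewer trip the trigger?) -> next shot.
# Mirrors a Blueprint Branch node fed by a trigger volume's overlap event.
SHOT_GRAPH = {
    ("establishing", True): "close_up",    # viewer showed interest: cut in
    ("establishing", False): "long_take",  # no interaction: default coverage
    ("close_up", True): "reaction",
    ("close_up", False): "long_take",
}

def next_shot(current, viewer_triggered):
    """Branch-node-style lookup: the viewer's interaction (True/False)
    selects the next shot, instead of a cut fixed in advance by the
    director."""
    return SHOT_GRAPH.get((current, viewer_triggered), "long_take")
```

Each entry in the table plays the role of one Branch node in the Blueprint graph, so adding shots means adding rows rather than re-editing a fixed cut list.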

An Intravenous Injection Simulator using Augmented Reality for Veterinary Education (증강현실 기술을 사용한 수의학 교육용 정맥 주사 훈련 시뮬레이터)

  • Lee, Jun;Seo, Anna;Kim, WonJong;Kim, Jee-In;Lee, SeungYeon;Eom, KiDong
    • Journal of the HCI Society of Korea
    • /
    • v.7 no.2
    • /
    • pp.25-34
    • /
    • 2012
  • Veterinary students learn and experience veterinary procedures through experiments and practice on real animals. However, because animal protection laws regulate animal experiments and restrict the number of experiments on laboratory animals, veterinary students have fewer chances to practice their training on real animals. This paper proposes a simulator for veterinary education based on augmented reality (AR). We selected the intravenous injection procedure for the simulation because it is the most frequently used procedure in veterinary training and the most difficult stage for beginning veterinary students. The proposed AR simulator provides a tangible prop shaped like the leg of a real dog. It also has an injection simulator that receives the user's input and sends force feedback to indicate the results of the injection simulation. We developed a workbench-type AR system with an LED display and cameras for visual information processing. Finally, we evaluated its performance through experiments and user studies to check the acceptance level and usability of the proposed system. We compared the proposed system with traditional video-based education and with an AR system using a head-mounted display (HMD). The results showed that the proposed system outperformed both.


Systemic Development of Tele-Robotic Interface for the Hot-Line Maintenance (활선 작업을 위한 원격 조종 인터페이스 개발)

  • Kim Min-Soeng;Lee Ju-Jang;Kim Chang-Hyun
    • Journal of Institute of Control, Robotics and Systems
    • /
    • v.10 no.12
    • /
    • pp.1217-1222
    • /
    • 2004
This paper describes the development of a tele-robotic interface for a hot-line maintenance robot system. One of the main issues in designing a human-robot interface for such a system is planning the control procedure for each part of the robotic system. Another issue is that the actual number of degrees of freedom (DOF) in the hot-line maintenance robot system is much greater than that of the available control devices, such as joysticks and gloves, in the remote cabin. For this purpose, a virtual simulator, which includes the virtual hot-line maintenance robot system and its environment, was developed in 3D using CAD data. It is assumed that control operations are performed in the remote cabin and that the overall work process is observed using the main camera with 2 DOFs. For the input devices, two joysticks, one pedal, two data gloves, and a head-mounted display (HMD) with a tracking sensor were used. An interface was developed for each control mode. The designed human-interface system is operated using high-level control commands that are intuitive and easy to understand without special training.