• Title/Summary/Keyword: Immersive virtual environment


Enhancing Construction Jobsite Safety Training Through iSAFE-Education: A Novel Approach Using Industry 4.0 Technologies

  • Mehrtash SOLTANI;Akeem PEDRO;Rahat HUSSEIN;Si Van TIEN TRAN;Aqsa SABIR;Doyeop Lee;Chansik PARK
    • International conference on construction engineering and project management / 2024.07a / pp.1212-1219 / 2024
  • Traditional safety training methods in construction, such as toolbox meetings and classroom sessions, fall short of addressing the challenges faced by workers, especially migrant workers hindered by language barriers. The compliance-driven nature of safety management practices is identified as a core issue, often disconnected from the realities of the jobsite. To tackle these limitations, we propose iSAFE-Education, an approach harnessing Industry 4.0 technologies. By integrating Virtual Reality (VR) and Metaverse environments into safety training, the platform immerses workers in authentic jobsite contexts using 360-degree panorama images. Our method provides virtual walkthroughs, enabling workers to familiarize themselves with site-specific features and safety protocols. Additionally, iSAFE-Education facilitates site assessment and safety-information sharing among project participants within the immersive jobsite environment. This paper highlights the importance of hazard-information delivery and positions the proposed solution as a means of enhancing jobsite safety in contemporary construction settings.

A Real Time 6 DoF Spatial Audio Rendering System based on MPEG-I AEP (MPEG-I AEP 기반 실시간 6 자유도 공간음향 렌더링 시스템)

  • Kyeongok Kang;Jae-hyoun Yoo;Daeyoung Jang;Yong Ju Lee;Taejin Lee
    • Journal of Broadcast Engineering / v.28 no.2 / pp.213-229 / 2023
  • In this paper, we introduce a spatial sound rendering system that provides 6DoF spatial sound in real time in response to the movement of a listener located in a virtual environment. The system was implemented using MPEG-I AEP as the development environment for the CfP response of MPEG-I Immersive Audio, and consists of an encoder and a renderer that includes a decoder. The encoder encodes, offline, metadata such as the spatial audio parameters of the virtual space scene described in the EIF and the directivity information of the sound source provided in the SOFA file, and delivers them in the bitstream. The renderer receives the transmitted bitstream and performs 6DoF spatial sound rendering in real time according to the position of the listener. The main spatial sound processing technologies applied in the rendering system are the sound source effect and the obstacle effect; additional system processing covers the Doppler effect, the sound field effect, and others. Results of a subjective self-evaluation of the developed system are also presented.
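
The renderer's distance-dependent source effect and Doppler processing can be illustrated with a minimal sketch; the inverse-distance gain model and the function names below are assumptions for illustration, not the MPEG-I Immersive Audio algorithms:

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at about 20 degrees C

def distance_gain(source, listener, ref_dist=1.0):
    """Inverse-distance attenuation, clamped at the reference distance."""
    d = math.dist(source, listener)
    return min(1.0, ref_dist / max(d, ref_dist))

def doppler_factor(source_speed_toward_listener, listener_speed_toward_source):
    """Factor by which perceived pitch is scaled as source and listener
    move toward (positive speeds) or away from (negative) each other."""
    return ((SPEED_OF_SOUND + listener_speed_toward_source) /
            (SPEED_OF_SOUND - source_speed_toward_listener))
```

A real-time renderer would re-evaluate such terms every frame from the tracked listener pose.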

Real-time VR Strategy Chess Game using Motion Recognition (VR기반 모션인식을 이용한 실시간 전략 체스 게임)

  • Kim, Young-Kwang;Yoon, Yeo-Song;Oh, Tea-Gyeoung;HwangBo, Yeung-Hwan;Hwang, Jeong-Hee
    • Journal of Digital Contents Society / v.18 no.1 / pp.1-7 / 2017
  • Virtual reality (VR), also known as immersive multimedia or computer-simulated reality, is a computer technology that replicates an environment, real or imagined, and simulates a user's physical presence in that environment to allow for user interaction. Virtual realities artificially create sensory experiences including sight, touch, hearing, and smell. Because most VR content relies on a single input device, users have difficulty manipulating the user interface and game objects, and immersion suffers because the mouse and keyboard cannot be seen in the virtual space. In this paper, we design and implement a chess game whose interface can be controlled easily and accurately through motion recognition, improving in-game immersion.

Study on Virtual Reality (VR) Operating System Prototype (가상환경(VR) 운영체제 프로토타입 연구)

  • Kim, Eunsol;Kim, Jiyeon;Yoo, Eunjin;Park, Taejung
    • Journal of Broadcast Engineering / v.22 no.1 / pp.87-94 / 2017
  • This paper presents a prototype virtual reality operating system (VR OS) concept using a head-mounted display (HMD) and hand gesture recognition technology based on a game engine (Unity3D). We have designed and implemented a simple multitasking thread mechanism built on the real-time environment provided by the Unity3D game engine. Our virtual reality operating system receives user input from a hand gesture recognition device (Leap Motion) to simulate a mouse and keyboard, and provides output via a head-mounted display (Oculus Rift DK2). As a result, our system offers users a broader and more immersive work environment by implementing a 360-degree workspace.
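
A multitasking mechanism layered on a frame-based real-time loop, as described above, can be sketched as cooperative tasks ticked once per frame, in the spirit of game-engine coroutines. The class and its design here are illustrative assumptions, not the paper's implementation:

```python
from collections import deque

class MiniScheduler:
    """Cooperative 'threads' ticked once per frame; each task is a
    generator that yields control back to the frame loop."""
    def __init__(self):
        self.tasks = deque()

    def spawn(self, gen):
        self.tasks.append(gen)

    def tick(self):
        # Run each task until it yields; drop tasks that finish.
        for _ in range(len(self.tasks)):
            task = self.tasks.popleft()
            try:
                next(task)
                self.tasks.append(task)
            except StopIteration:
                pass

def blink(log, n):
    """Example task: record one step per frame for n frames."""
    for i in range(n):
        log.append(i)
        yield
```

In an engine, `tick()` would be called from the per-frame update callback, so every spawned task advances once per rendered frame.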

The development of training platform for CiADS using cave automatic virtual environment

  • Jin-Yang Li;Jun-Liang Du;Long Gu;You-Peng Zhang;Xin Sheng;Cong Lin;Yongquan Wang
    • Nuclear Engineering and Technology / v.55 no.7 / pp.2656-2661 / 2023
  • Construction of the China initiative Accelerator Driven Subcritical (CiADS) system began in southeast China's Guangdong province in 2019, and the facility is expected to be checked and accepted in 2025. To help students at the University of Chinese Academy of Sciences (UCAS) better understand the main characteristics and operating conditions of the subcritical nuclear facility, a training platform for CiADS has been developed based on the Cave Automatic Virtual Environment (CAVE) at the Institute of Modern Physics, Chinese Academy of Sciences (IMPCAS). The CAVE platform is a non-head-mounted virtual reality display system that provides an immersive experience and an alternative training platform, substituting for dangerous operation experiments involving strong radioactivity. In this paper, the CAVE platform for CiADS training scenarios is presented with real-time simulation features, and the devices required to generate auditory and visual stimuli in interactive mode are detailed. Moreover, a three-dimensional modeling database has been created for the different operating conditions, giving teachers more freedom to generate appropriate training courses for the students. These user-friendly features offer students a deeply realistic impression, so they can gain the required knowledge and experience without the large costs of a traditional experimental nuclear reactor.

Development of an Interactive Virtual Reality Service based on 360 degree VR Image (360도 파노라마 영상 기반 대화형 가상현실 서비스 구축)

  • Kang, Byoung-Gil;Ryu, Seuc-Ho;Lee, Wan-Bok
    • Journal of Digital Convergence / v.15 no.11 / pp.463-470 / 2017
  • Currently, virtual reality content using VR images is in the spotlight because it can be easily created and utilized. However, because VR images lack interaction, their applications and usability are limited. To overcome this problem, we propose a new method in which 360-degree panorama images and a game engine are combined to build a high-resolution interactive VR service that runs in real time. In particular, since the background image, represented as a panorama image, is pre-generated through heavy rendering computation, it can be used to provide an immersive VR service with a relatively small amount of run-time computation on a low-performance device. To show the effectiveness of the proposed method, an interactive game set in a virtual zoo environment was implemented, illustrating that the method can considerably improve user interaction and the immersive experience.
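
The core of such a panorama-based service is mapping the user's view direction onto the pre-rendered 360-degree image each frame. A minimal sketch of the equirectangular lookup (the axis convention, with y up and -z forward, is an assumption):

```python
import math

def equirect_uv(direction):
    """Map a 3D unit view direction to texture coordinates (u, v) in
    [0, 1] on an equirectangular panorama image."""
    x, y, z = direction
    u = 0.5 + math.atan2(x, -z) / (2.0 * math.pi)   # longitude
    v = 0.5 - math.asin(max(-1.0, min(1.0, y))) / math.pi  # latitude
    return u, v
```

A game engine performs the same mapping per pixel when it renders the panorama on the inside of a sphere around the camera.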

User Evaluation of Encountered Type Haptic System with Visual-Haptic Co-location (시각 - 촉각 일치된 마중형 햅틱 제시 시스템의 사용자 평가)

  • Cha, Baekdong;Bae, Yoosung;Choi, Wonil;Ryu, Jeha
    • Journal of the HCI Society of Korea / v.14 no.2 / pp.13-20 / 2019
  • For encountered-type haptic display systems among virtual training systems for industrial safety, visual-haptic co-location is required for natural interaction between virtual and real objects. In this paper, we performed a user evaluation of an immersive VR haptic system that implements some level of visual-haptic co-location through a careful and accurate calibration method. The goal of the evaluation is to show that user performance (reaction time and distance accuracy) is not significantly different between the two environments for the tasks performed. The user evaluation results show statistically significant differences in reaction time, but the absolute difference is less than 1 second. Meanwhile, distance accuracy shows no difference between the virtual and the actual environments. Therefore, it can be concluded that the developed haptic virtual training system can provide inexpensive industrial safety training in place of a costly actual environment.
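
Comparing reaction times between the real and virtual conditions is a two-sample comparison; the abstract does not name the exact statistic used, so the Welch t statistic below is only an illustrative assumption of how such a check might look:

```python
import statistics as st

def welch_t(a, b):
    """Welch's t statistic for two independent samples; a value near 0
    suggests the group means (e.g., reaction times) are similar."""
    va, vb = st.variance(a), st.variance(b)
    return (st.mean(a) - st.mean(b)) / (va / len(a) + vb / len(b)) ** 0.5
```

The statistic would then be compared against a t distribution with Welch-Satterthwaite degrees of freedom to obtain a p-value.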

Visualization of Welded Connections based on Shader for Virtual Welding Training (가상현실 용접 훈련을 위한 쉐이더 기반 특수효과 표현)

  • Oh, Soobin;Jo, Dongsik
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference / 2019.05a / pp.479-481 / 2019
  • In recent years, training systems in various industrial fields have been built using virtual reality (VR) technology and are widely used in education. A virtual-reality-based training system is safe, wastes no material, and has the advantage of allowing practice anytime and anywhere. For example, virtual reality welding training simulation systems are widely used for field workers because they can simulate the actual joining of steel plates in an immersive environment. Realistic representation of the steel plate joint is important for maximizing the effectiveness of the training, but existing techniques have limited the natural expression of this effect. In this study, we propose a method of visualizing the joint effect based on shaders for a welding training system. The results of this study can be applied to welding training systems to improve the training effect by providing the user with high-quality visualization.
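
A shader of this kind typically maps weld temperature to an emissive color ramp per fragment. A CPU-side sketch of such a ramp (the breakpoints below are invented for illustration and are not the paper's shader):

```python
def heat_color(t):
    """Map a normalized weld temperature t in [0, 1] to an RGB glow:
    dark red at low heat through orange to near-white at full heat."""
    t = max(0.0, min(1.0, t))          # clamp, as a shader saturate() would
    r = min(1.0, 0.3 + 0.7 * t)        # red rises first
    g = max(0.0, 1.6 * t - 0.6)        # green joins at mid heat
    b = max(0.0, 2.0 * t - 1.4)        # blue only near peak heat
    return (r, g, b)
```

In a fragment shader the same piecewise ramp would be evaluated per pixel, driven by a temperature value animated over the weld seam.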

Collaborative Visualization of Warfare Simulation using a Commercial Game Engine (상업용 게임 엔진을 활용한 전투 시뮬레이션 결과의 협업 가시화)

  • Kim, Hyungki;Kim, Junghoon;Kang, Yuna;Shin, Suchul;Kim, Imkyu;Han, Soonhung
    • Journal of the Korea Society for Simulation / v.22 no.4 / pp.57-66 / 2013
  • The need for reusable 3D visualization tools has been raised in various industries. Especially in the defense modeling and simulation (M&S) domain, there is abundant research on reusable and interoperable visualization systems, since visualization plays a critical role in efficient decision making by supporting diverse validation and analysis processes. To facilitate this, state-of-the-art M&S systems are applying VR (Virtual Reality) or AR (Augmented Reality) technologies. To reduce the development burden, we design a collaborative visualization environment based on the commercial game engine Unity3D. We define the requirements of warfare simulation by analyzing the pros and cons of existing tools and engines such as SIMDIS and Vega, and apply the functionalities of the commercial game engine to satisfy those requirements. A prototype has been implemented as the collaborative visualization environment of the iCAVE at KAIST, a facility for immersive virtual environments. The facility is interoperable with smart devices.

Real-time 3D Audio Downmixing System based on Sound Rendering for the Immersive Sound of Mobile Virtual Reality Applications

  • Hong, Dukki;Kwon, Hyuck-Joo;Kim, Cheong Ghil;Park, Woo-Chan
    • KSII Transactions on Internet and Information Systems (TIIS) / v.12 no.12 / pp.5936-5954 / 2018
  • Eight of the ten largest technology companies in the world have been involved in some way with the coming mobile VR revolution since Facebook acquired Oculus. This trend has allowed technology related to mobile VR to achieve remarkable growth in both academia and industry. Reproducing acoustic cues realistically is therefore increasingly important, because auditory cues can enhance perception of a complicated surrounding environment in VR even without the visual system. This paper presents a hardware-based audio downmixing system for auralization, a stage of the sound rendering pipeline that can reproduce reality-like sound but requires high computation costs. The proposed system is verified on an FPGA platform, with special focus on hardware architectural designs for low power and real-time operation. The results show that the proposed system on an FPGA can downmix a maximum of 5 sources at a real-time rate (52 FPS) with a low power consumption of 382 mW. Furthermore, the quality of the 3D sound generated with the proposed system was verified as satisfactory via user evaluation.
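
The downmixing stage combines multiple rendered source signals into the output channels. A software sketch of a stereo downmix with constant-power panning follows; this is a common gain law chosen for illustration, not the paper's hardware architecture, whose details the abstract does not specify:

```python
import math

def downmix_stereo(sources):
    """Mix mono sources into one stereo buffer.

    sources: list of (samples, pan) pairs, pan in [-1, 1]
    (-1 = hard left, 0 = center, 1 = hard right).
    """
    length = max(len(s) for s, _ in sources)
    left = [0.0] * length
    right = [0.0] * length
    for samples, pan in sources:
        theta = (pan + 1.0) * math.pi / 4.0      # map pan to [0, pi/2]
        gl, gr = math.cos(theta), math.sin(theta)  # constant-power gains
        for i, x in enumerate(samples):
            left[i] += gl * x
            right[i] += gr * x
    return left, right
```

With constant-power gains, gl^2 + gr^2 = 1 for every pan position, so perceived loudness stays roughly constant as a source moves across the stereo field; a hardware implementation would pipeline the same multiply-accumulate per sample.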