• Title/Summary/Keyword: Virtual Shooting


A Study on Trainer and Cover Recognition Algorithm for Posture Recognition of Virtual Shooting Trainer (가상 사격 훈련자 자세인식을 위한 훈련자와 엄폐물 인식 알고리즘 연구)

  • Kim, Hyung-O;Hong, ChangHo;Cho, Sung Ho;Park, Youster
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference
    • /
    • 2021.05a
    • /
    • pp.298-300
    • /
    • 2021
  • In line with the expanded scientific training program of "Defense Reform 2.0", the Ministry of National Defense decided to build a realistic combat simulation training system based on virtual and augmented reality. Such a system should maximize tension and training effect, as in actual combat, through engagement between trainees, and should strengthen survival training alongside combat-like shooting training through the use of cover. Previous studies provide suitable techniques for improving a trainee's shooting precision, but they make it difficult to practice two-way engagement as in actual combat and are particularly lacking for combat shooting training that uses cover. Therefore, in this paper we propose a software algorithm that generates a virtual avatar by recognizing the opponent's shooting posture on the screen of the virtual shooting trainer. The algorithm recognizes the trainee and the cover from depth information acquired through a depth sensor and estimates the trainee's posture; a hedged sketch of this kind of depth-based separation follows this entry.

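The abstract above describes separating the trainee from the cover using only depth data before estimating posture. The following Python sketch is a minimal illustration of that idea under stated assumptions, not the authors' algorithm: it assumes a depth frame given as a NumPy array in millimetres, splits it into a near "cover" layer and a farther "person" layer using invented depth thresholds, and keeps the largest connected blob as the trainee silhouette that a pose estimator could then consume.

```python
import numpy as np
from scipy import ndimage  # connected-component labeling

def segment_trainee_and_cover(depth_mm: np.ndarray,
                              cover_max_mm: float = 1500.0,
                              person_max_mm: float = 4000.0):
    """Split a depth frame into a cover mask and a trainee mask.

    depth_mm      : HxW depth image in millimetres (0 = no reading).
    cover_max_mm  : anything closer than this counts as cover (assumed value).
    person_max_mm : anything between cover_max_mm and this is a person candidate.
    """
    valid = depth_mm > 0
    cover_mask = valid & (depth_mm <= cover_max_mm)
    person_candidates = valid & (depth_mm > cover_max_mm) & (depth_mm <= person_max_mm)

    # Keep only the largest connected blob as the trainee silhouette.
    labels, n = ndimage.label(person_candidates)
    if n == 0:
        return cover_mask, np.zeros_like(person_candidates)
    sizes = ndimage.sum(person_candidates, labels, index=range(1, n + 1))
    trainee_mask = labels == (int(np.argmax(sizes)) + 1)
    return cover_mask, trainee_mask

# Usage: the trainee mask (or the masked depth pixels) would be handed to a
# skeletal pose estimator to recover the opponent's shooting posture.
depth = np.random.randint(500, 5000, size=(480, 640)).astype(np.float32)
cover, trainee = segment_trainee_and_cover(depth)
```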

A study on the actual precision shooting training based on virtual reality (가상현실 기반 실전적 정밀사격훈련 구현 연구)

  • Lee, Byounghwak;Kim, Jonghwan;Shin, Kyuyoung;Kim, Dongwook;Lee, Wonwoo;Kim, Namhyuk
    • Convergence Security Journal
    • /
    • v.18 no.4
    • /
    • pp.62-71
    • /
    • 2018
  • The rapid growth of virtual reality technology in the era of the Fourth Industrial Revolution, together with ICT (information and communications technology), has accelerated the shift toward science-based combat training systems in the military. Recently, research and development of virtual-reality simulators has been actively conducted to address sensitive issues such as civil complaints about shooting-range noise, the prevention of shooting accidents, and the reduction of training costs. In this paper, we propose two key solutions for overcoming the technical limitations of current training simulators: a spatial synchronization method and a modified point-mass trajectory model with a small-angle approximation (a hedged trajectory sketch follows this entry). A trainee wearing a haptic vest in the mixed reality environment of MARS (medium-range assault rifle shooting simulator) can conduct not only precision shooting but also two-way engagement with virtual opponents, and can receive more reliable evaluations in MARS than with an existing laser-based rifle simulator.

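The second contribution named above is a modified point-mass trajectory model with a small-angle approximation. As a rough, hedged illustration of what such a model involves (not the authors' equations), the Python sketch below integrates a point-mass bullet under gravity and quadratic drag with simple Euler steps; the muzzle velocity, drag parameters, and launch angle are placeholder values, and the small-angle approximation enters when the initial velocity is set (cos θ ≈ 1, sin θ ≈ θ).

```python
import numpy as np

def point_mass_trajectory(muzzle_speed=900.0, launch_angle_rad=0.002,
                          mass=0.004, drag_coeff=0.25, area=2.9e-5,
                          rho=1.225, g=9.81, dt=1e-3, max_range=800.0):
    """Integrate a 2-D point-mass trajectory with quadratic drag (Euler steps).

    All parameter values are placeholders for illustration only.
    Small-angle approximation: vx0 ~ v, vy0 ~ v * angle.
    """
    pos = np.array([0.0, 1.5])                         # x downrange, y height (m)
    vel = np.array([muzzle_speed,                      # cos(theta) ~ 1
                    muzzle_speed * launch_angle_rad])  # sin(theta) ~ theta
    k = 0.5 * rho * drag_coeff * area / mass           # drag factor per unit mass
    path = [pos.copy()]
    while pos[0] < max_range and pos[1] > 0.0:
        speed = np.linalg.norm(vel)
        acc = -k * speed * vel + np.array([0.0, -g])   # drag opposes velocity; gravity pulls down
        vel = vel + acc * dt
        pos = pos + vel * dt
        path.append(pos.copy())
    return np.array(path)

trajectory = point_mass_trajectory()
print(f"stopped at x = {trajectory[-1, 0]:.1f} m, height = {trajectory[-1, 1]:.2f} m")
```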

A Study for Virtual Reality 360 Video Production Workflow with HDR Using log Shooting (log 촬영과 HDR을 이용한 실사 360 영상 제작흐름 연구)

  • Kim, Chulhyun
    • Journal of Broadcast Engineering
    • /
    • v.23 no.1
    • /
    • pp.63-73
    • /
    • 2018
  • Today, VR content is created in three ways: CG-based, game-engine-based, and live-action shooting, with live action being the most widely used. Most live-action footage has so far been shot with action cams, a method that differs from the professional image production used for movies and TV dramas. This study points out the differences between professional image production and action-cam-based shooting and proposes an alternative: a log-shooting-based HDR filming and editing workflow. In test shooting and editing, the proposed method captured more color information than conventional action-cam shooting and thereby produced high-definition images that are difficult to achieve with an action cam.
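
To make the log-shooting idea above concrete, here is a minimal, hedged sketch of the kind of transform involved. It uses a generic logarithmic encode/decode pair, not any specific camera's log curve or the author's grading workflow, to show how a log profile compresses a wide range of scene brightness into recordable code values that can later be linearized for HDR grading.

```python
import numpy as np

def log_encode(linear, a=5.0):
    """Generic log curve: compress normalized linear light (0..1) into 0..1 code values."""
    return np.log1p(a * np.clip(linear, 0.0, 1.0)) / np.log1p(a)

def log_decode(code, a=5.0):
    """Inverse of log_encode: recover linear light before HDR grading."""
    return np.expm1(np.asarray(code) * np.log1p(a)) / a

# A scene sweep from deep shadow to white (normalized linear light).
scene = np.array([0.01, 0.05, 0.18, 0.50, 1.00])
encoded = log_encode(scene)       # what a log profile would record
recovered = log_decode(encoded)   # what the editor linearizes during the HDR grade
print(np.round(encoded, 3), np.round(recovered, 3))
```

Note how the shadow values occupy a much larger share of the encoded range than they would in a linear recording, which is where the extra usable color information comes from.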

A Study on Implementation of Motion Graphics Virtual Camera with AR Core

  • Jung, Jin-Bum;Lee, Jae-Soo;Lee, Seung-Hyun
    • Journal of the Korea Society of Computer and Information
    • /
    • v.27 no.8
    • /
    • pp.85-90
    • /
    • 2022
  • This study proposes a method for creating a motion-graphics virtual camera from the real-time tracking data of an AR Core-based mobile device, in order to reduce the time and cost of the traditional motion-graphics workflow while reproducing virtual-camera movement identical to that of the real camera. Instead of running a separate tracking pass on the video file stored after shooting, the proposed method performs tracking on the AR Core device while shooting, so that tracking success can be confirmed at the shooting stage. In the experiment there was no difference in the resulting motion-graphics image compared with the conventional method, but the conventional tracking step took 6 minutes and 10 seconds for a 300-frame clip, whereas the proposed method omits this step entirely and is therefore far more time-efficient. With interest in image production using virtual and augmented reality growing and various studies underway, this work can be applied to virtual camera creation and match moving.
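
As a hedged illustration of the general idea described above (not the authors' implementation), the Python sketch below converts per-frame camera poses of the kind an AR Core session reports, a position plus a rotation quaternion, into simple translate/rotate keyframes that a motion-graphics virtual camera could consume; the pose list and the keyframe dictionary format are invented for the example.

```python
import math
from dataclasses import dataclass

@dataclass
class TrackedPose:
    """One frame of device tracking: position in metres and rotation quaternion (x, y, z, w)."""
    frame: int
    position: tuple
    quaternion: tuple

def quaternion_to_euler_deg(x, y, z, w):
    """Convert a unit quaternion to yaw/pitch/roll in degrees (ZYX convention)."""
    yaw = math.degrees(math.atan2(2 * (w * z + x * y), 1 - 2 * (y * y + z * z)))
    pitch = math.degrees(math.asin(max(-1.0, min(1.0, 2 * (w * y - z * x)))))
    roll = math.degrees(math.atan2(2 * (w * x + y * z), 1 - 2 * (x * x + y * y)))
    return yaw, pitch, roll

def poses_to_keyframes(poses):
    """Turn tracked poses into position/rotation keyframes for a virtual camera."""
    keyframes = []
    for p in poses:
        yaw, pitch, roll = quaternion_to_euler_deg(*p.quaternion)
        keyframes.append({"frame": p.frame,
                          "position": p.position,
                          "rotation_deg": (yaw, pitch, roll)})
    return keyframes

# Hypothetical two-frame track captured during shooting.
track = [TrackedPose(0, (0.00, 1.60, 0.00), (0.0, 0.00, 0.0, 1.0000)),
         TrackedPose(1, (0.01, 1.60, -0.02), (0.0, 0.02, 0.0, 0.9998))]
print(poses_to_keyframes(track))
```

Because the keyframes come straight from the on-device tracking, the separate match-moving pass over the recorded footage (the six-minute step mentioned above) is no longer needed.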

Application of Virtual Studio Technology and Digital Human Monocular Motion Capture Technology -Based on <Beast Town> as an Example-

  • YuanZi Sang;KiHong Kim;JuneSok Lee;JiChu Tang;GaoHe Zhang;ZhengRan Liu;QianRu Liu;ShiJie Sun;YuTing Wang;KaiXing Wang
    • International Journal of Internet, Broadcasting and Communication
    • /
    • v.16 no.1
    • /
    • pp.106-123
    • /
    • 2024
  • This article takes the talk show "Beast Town" as an example to introduce the overall technical solution, the technical difficulties, and the countermeasures involved in combining cartoon virtual characters with virtual studio technology, providing reference experience for multi-scenario applications of digital humans. Building on earlier mixed-reality live broadcasts, we further upgraded our virtual production and digital-human driving technology, adopting industry-leading real-time virtual production and monocular-camera driving to launch the virtual cartoon-character talk show "Beast Town", combining the real and the virtual, enhancing program immersion and the audio-visual experience, and expanding the boundaries of virtual production. In the talk show, motion-capture shooting is used for final picture synthesis: the virtual scene must present dynamic effects while the digital human is driven in step with the push, pull, and pan of the overall picture. This places very high demands on multi-source data synchronization, real-time driving of the digital human, and composite-picture rendering. We focus on issues such as matching virtual and real data and the quality of monocular-camera motion capture, and we combine outward camera tracking, multi-scene picture perspective, and multi-machine rendering to solve picture-linkage and rendering-quality problems in a deeply immersive space environment, presenting users with visuals in which the digital human and the live guests interact.
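
One of the difficulties named above is multi-source data synchronization between the motion-capture stream, the camera-tracking stream, and the renderer. The Python snippet below is only a minimal, hedged sketch of timestamp-based alignment, with invented stream data, and is not the production pipeline described in the article: for each render tick it picks the most recent motion-capture and camera-tracking samples that are not newer than the tick.

```python
import bisect

def latest_sample_at(timestamps, samples, t):
    """Return the most recent sample with timestamp <= t (None if none has arrived yet)."""
    i = bisect.bisect_right(timestamps, t)
    return samples[i - 1] if i > 0 else None

# Hypothetical streams (timestamps in seconds, payload placeholders).
mocap_ts = [0.000, 0.033, 0.066, 0.100]
mocap = ["pose_a", "pose_b", "pose_c", "pose_d"]
camera_ts = [0.010, 0.050, 0.090]
camera = ["cam_a", "cam_b", "cam_c"]

# Render at 30 fps: each composite frame uses the newest data available at its tick.
for tick in (0.033, 0.066, 0.100):
    frame = {"mocap": latest_sample_at(mocap_ts, mocap, tick),
             "camera": latest_sample_at(camera_ts, camera, tick)}
    print(f"{tick:.3f}s -> {frame}")
```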

A Research on the Design and Implementation of LED Display-based Light Gun Systems

  • Byong-Kwon Lee
    • Journal of the Korea Society of Computer and Information
    • /
    • v.29 no.5
    • /
    • pp.85-91
    • /
    • 2024
  • With the current surge in firearm-based leisure sports and the high cost of live shooting practice in the military, there is growing interest in virtual reality as a cost-effective alternative. This study proposes a system that addresses the drawbacks of existing shooting-practice setups, such as dim spaces and high installation costs, by making practice feasible on large display screens. The system integrates IR receivers and practice guns, with an application ensuring usability and efficiency, and an accuracy-adjustment feature improves precise coordinate recognition. As a result, this cyber light gun system offers an affordable solution for outdoor training.
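
The "accuracy adjustment feature" mentioned above suggests a calibration step that maps the gun's raw aim readings onto true screen coordinates. Below is a minimal, hedged sketch of one common approach, a least-squares affine fit over a few known calibration targets; the sample points are invented and this is not necessarily the method used in the paper.

```python
import numpy as np

def fit_affine(raw_points, screen_points):
    """Least-squares affine map from raw aim readings to screen coordinates.

    Solves [x_raw, y_raw, 1] @ A = [x_screen, y_screen] for the 3x2 matrix A.
    """
    raw = np.asarray(raw_points, dtype=float)
    scr = np.asarray(screen_points, dtype=float)
    design = np.hstack([raw, np.ones((len(raw), 1))])
    A, *_ = np.linalg.lstsq(design, scr, rcond=None)
    return A

def apply_affine(A, raw_xy):
    """Correct one raw aim reading using the fitted calibration."""
    x, y = raw_xy
    return np.array([x, y, 1.0]) @ A

# Hypothetical calibration: the user aims at four known on-screen targets (pixels).
raw = [(102, 95), (1830, 110), (1815, 990), (98, 1002)]
true = [(0, 0), (1920, 0), (1920, 1080), (0, 1080)]
A = fit_affine(raw, true)
print(np.round(apply_affine(A, (960, 540))))  # corrected aim point for a centre shot
```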

A Design and Implementation of Virtual Reality Shooting Practice Game (가상현실 사격 연습 게임 설계 및 구현)

  • Park, Jin-yang;Lee, Sol
    • Proceedings of the Korean Society of Computer Information Conference
    • /
    • 2017.07a
    • /
    • pp.51-52
    • /
    • 2017
  • This paper proposes the design and implementation of a virtual reality (VR) shooting practice game that reflects realistic firearm mechanisms. To give the user the sense of holding and firing a real gun while wearing a VR headset and using motion controllers, the game implements not only realistic graphics and surroundings but also a range of realistic gun behaviors including recoil, bolt movement, and reloading, and, true to its nature as a game, adds elements that hold the user's interest, such as a score system, making the most of the potential of VR application software; a hedged sketch of such gun-behavior logic follows this entry.

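The abstract above (translated) lists recoil, bolt movement, and reloading as the firearm behaviors the game reproduces. Below is a small, hedged sketch of how such behaviors are often organized as a weapon state machine with a score counter; the states, transitions, and scoring are invented for illustration and are not taken from the paper.

```python
from enum import Enum, auto

class GunState(Enum):
    READY = auto()
    BOLT_CYCLING = auto()
    RELOADING = auto()

class PracticeGun:
    """Toy weapon model: fire -> bolt cycle -> ready, reload when the magazine empties."""
    def __init__(self, magazine_size=10):
        self.magazine_size = magazine_size
        self.rounds = magazine_size
        self.state = GunState.READY
        self.score = 0

    def trigger_pull(self, hit_target: bool) -> bool:
        if self.state is not GunState.READY or self.rounds == 0:
            return False                        # gun busy or dry fire
        self.rounds -= 1
        self.score += 10 if hit_target else 0   # score system, as in the abstract
        self.state = GunState.BOLT_CYCLING      # recoil / bolt animation would play here
        return True

    def bolt_cycled(self):
        self.state = GunState.RELOADING if self.rounds == 0 else GunState.READY

    def reload_finished(self):
        self.rounds = self.magazine_size
        self.state = GunState.READY

gun = PracticeGun(magazine_size=2)
for hit in (True, False):
    gun.trigger_pull(hit)
    gun.bolt_cycled()
print(gun.state, gun.score)  # GunState.RELOADING 10
```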

Design of Gaming Interaction Control using Gesture Recognition and VR Control in FPS Game (FPS 게임에서 제스처 인식과 VR 컨트롤러를 이용한 게임 상호 작용 제어 설계)

  • Lee, Yong-Hwan;Ahn, Hyochang
    • Journal of the Semiconductor & Display Technology
    • /
    • v.18 no.4
    • /
    • pp.116-119
    • /
    • 2019
  • User interface/experience and realistic game manipulation play an important role in virtual reality first-person shooting (FPS) games. This paper presents an intuitive, hands-free gaming interaction scheme for FPS based on the user's gesture recognition and VR controllers. We examine the conventional interface for VR FPS interaction and design player interaction for a user wearing a head-mounted display with two motion devices: a Leap Motion to handle low-level physics interaction and a VIVE tracker to control the movement of the player's joints in the VR world. The FPS prototype system shows that the designed interface helps players enjoy an immersive FPS and gives them a new gaming experience.
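
As a hedged illustration of mapping hand-tracking data to FPS actions (not the authors' interaction design), the Python sketch below classifies a simple fist/pinch/open-palm gesture from hypothetical per-finger extension flags of the kind a Leap-Motion-style tracker can provide, and maps the result to game actions.

```python
from dataclasses import dataclass

@dataclass
class HandFrame:
    """Hypothetical hand-tracking sample: finger extension flags plus pinch strength."""
    thumb: bool
    index: bool
    middle: bool
    ring: bool
    pinky: bool
    pinch_strength: float  # 0.0 (open) .. 1.0 (thumb and index touching)

def classify_gesture(hand: HandFrame) -> str:
    extended = sum([hand.thumb, hand.index, hand.middle, hand.ring, hand.pinky])
    if hand.pinch_strength > 0.8:
        return "PINCH"        # e.g. pick up an item / interact
    if extended == 0:
        return "FIST"         # e.g. grip the weapon
    if extended == 5:
        return "OPEN_PALM"    # e.g. release / open the menu
    return "OTHER"

ACTION_MAP = {"PINCH": "interact", "FIST": "grip_weapon", "OPEN_PALM": "open_menu"}

frame = HandFrame(thumb=False, index=False, middle=False, ring=False,
                  pinky=False, pinch_strength=0.1)
print(ACTION_MAP.get(classify_gesture(frame), "none"))  # grip_weapon
```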

A study on lighting angle for improvement of 360 degree video quality in metaverse (메타버스에서 360° 영상 품질향상을 위한 조명기 투사각연구)

  • Kim, Joon Ho;An, Kyong Sok;Choi, Seong Jhin
    • The Journal of the Convergence on Culture Technology
    • /
    • v.8 no.1
    • /
    • pp.499-505
    • /
    • 2022
  • Recently, the metaverse has been receiving a great deal of attention. The metaverse is a virtual space in which various events can be held, and 360-degree video, a format well suited to the metaverse, is attracting particular attention. A 360-degree video is created by stitching together images captured in all directions with multiple cameras or lenses. When shooting 360-degree video, shooting equipment and crew working in front of the camera appear in the footage, so everything around the camera except the subject has to be hidden. This shooting method causes several problems, of which lighting is the biggest, because it is very difficult to place a light that focuses on the subject from behind the camera as in conventional shooting. This study experimentally searches for the optimal projection angle for 360-degree images by adjusting the angle of indoor lighting, and proposes a method for recording 360-degree video without installing additional lighting. Based on these results, we expect future experiments over a wider range of shooting angles, which should also be helpful when using 360-degree images in the metaverse space.

Virtual Reality Image Shooting for Single Person Broadcasting with Multiple Smartphones

  • Budiman, Sutanto Edward;Lee, Suk-Ho
    • International Journal of Internet, Broadcasting and Communication
    • /
    • v.11 no.2
    • /
    • pp.43-49
    • /
    • 2019
  • One-person media broadcasting has become popular, and the multimedia techniques that support it are becoming more and more advanced. One of the most prominent is virtual reality technology, which turns the one-person broadcasting environment into a virtual reality environment. However, because such an environment requires expensive equipment, it is not easy for ordinary individuals to set one up. Therefore, in this paper we propose a way to construct virtual-reality-like panoramas with multiple smartphones. For this purpose, we designed a special rig that firmly holds 8 smartphone cameras with overlapping views of the environment so that panorama stitching becomes possible. To reduce the computation cost, we precompute the homography matrices and use 1-D pointer structures to store the computed coordinate values.
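
To illustrate the stitching step described above, here is a minimal, hedged sketch using OpenCV: a homography is estimated once between two overlapping views (standing in for the paper's precomputed matrices, whose actual values and storage scheme are not given here) and then reused to warp every subsequent frame pair without re-running feature matching.

```python
import cv2
import numpy as np

def precompute_homography(img_ref, img_src):
    """Estimate a homography mapping img_src onto img_ref's plane (done once per camera pair)."""
    orb = cv2.ORB_create(2000)
    k1, d1 = orb.detectAndCompute(img_ref, None)
    k2, d2 = orb.detectAndCompute(img_src, None)
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(d1, d2)
    pts_ref = np.float32([k1[m.queryIdx].pt for m in matches])
    pts_src = np.float32([k2[m.trainIdx].pt for m in matches])
    H, _ = cv2.findHomography(pts_src, pts_ref, cv2.RANSAC, 5.0)
    return H

def stitch_pair(img_ref, img_src, H):
    """Reuse the precomputed H to warp img_src onto a canvas next to img_ref."""
    h, w = img_ref.shape[:2]
    canvas = cv2.warpPerspective(img_src, H, (w * 2, h))  # simple side-by-side canvas
    canvas[0:h, 0:w] = img_ref
    return canvas

# Usage per video frame (H is computed only once from a calibration frame pair):
# H = precompute_homography(cal_left, cal_right)
# panorama = stitch_pair(frame_left, frame_right, H)
```

Caching H (and, as the paper suggests, the precomputed pixel coordinates themselves) moves the expensive geometry work out of the per-frame loop, which is what makes a multi-smartphone rig practical for this kind of broadcasting.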