http://dx.doi.org/10.5392/IJoC.2020.16.4.068

A Study on "A Midsummer Night's Palace" Using VR Sound Engineering Technology  

Seok, MooHyun (Chung-Ang University, Art & Technology)
Kim, HyungGi (Chung-Ang University, Art & Technology)
Abstract
VR (Virtual Reality) content makes the audience perceive a virtual space as real through a virtual Z axis, which uses the distance between the viewer's eyes to create depth that cannot be achieved in 2D. This visual change has in turn required technological changes in the sound and sound sources inserted into VR content. However, studies on increasing immersion in VR content still focus mainly on scientific and visual fields, because composing and producing VR sound requires expertise in two areas: sound-based engineering and computer-based interactive sound engineering. Sound-based engineering directs the sound effects, script sound, and background music according to the storyboard organized by the director, so it has difficulty reflecting changes in user interaction or in time and space; its advantage is that the sound effects, script sound, and background music are produced in a single track and no coding phase is required. Computer-based interactive sound engineering, on the other hand, produces the sound effects, script sound, and background music as separate files. It can increase immersion by reflecting user interaction and time and space, but it can also suffer from noise cancelling and sound collisions. In this study, the following method was therefore devised and used to produce the sound for the VR content "A Midsummer Night" so as to take advantage of each sound-making technology. First, the storyboard is analyzed according to user interaction to identify the sound effects, script sound, and background music required for each interaction. Second, the sounds are classified and analyzed as 'simultaneous sound' and 'individual sound'. Third, interaction coding is carried out for the sound effects, script sound, and background music produced in the simultaneous sound and individual sound categories. Finally, the content is completed by applying the sound to the video. Through this process, sound quality inhibitors such as noise cancelling can be removed while producing sound that fits user interaction and time and space.
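
The abstract describes the classification and interaction-coding workflow only in prose; the paper gives no implementation code. The following minimal Python sketch is therefore only an illustration, assuming a hypothetical scene/cue data model, event names, and playback callback, of how cues classified as 'simultaneous sound' and 'individual sound' might be organized: simultaneous cues start together when a scene is loaded, while individual cues are bound to user-interaction events and played on demand.

from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class SoundCue:
    # Each cue is one of the three source types named in the abstract and is
    # classified as either "simultaneous" or "individual".
    name: str
    source_type: str          # "effect" | "script" | "background"
    category: str             # "simultaneous" | "individual"
    trigger_event: str = ""   # interaction event name for individual cues

@dataclass
class Scene:
    name: str
    cues: List[SoundCue] = field(default_factory=list)

class InteractiveSoundDirector:
    """Sketch of the interaction-coding step: simultaneous cues start with
    the scene, individual cues are bound to user-interaction events."""

    def __init__(self, play: Callable[[SoundCue], None]):
        self.play = play                      # injected audio back end (assumed)
        self.bindings: Dict[str, List[SoundCue]] = {}

    def load_scene(self, scene: Scene) -> None:
        self.bindings.clear()
        for cue in scene.cues:
            if cue.category == "simultaneous":
                self.play(cue)                # background layers start together
            else:
                self.bindings.setdefault(cue.trigger_event, []).append(cue)

    def on_interaction(self, event: str) -> None:
        for cue in self.bindings.get(event, []):
            self.play(cue)                    # individual cue fired by the user

if __name__ == "__main__":
    # Usage example with a stand-in playback function and invented cue names.
    scene = Scene("palace_garden", [
        SoundCue("night_ambience", "background", "simultaneous"),
        SoundCue("narration_intro", "script", "simultaneous"),
        SoundCue("door_creak", "effect", "individual", trigger_event="gaze_door"),
    ])
    director = InteractiveSoundDirector(lambda c: print(f"play {c.name}"))
    director.load_scene(scene)
    director.on_interaction("gaze_door")

In an actual production the stand-in playback callback would be replaced by whatever spatial-audio engine the VR content uses.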
Keywords
VR sound; Sound engineering; Sound production; Sound making; Sound directing