Developing Virtual Tour Content for the Inside and Outside of a Building using Drones and Matterport

  • Received : 2022.04.28
  • Accepted : 2022.09.28
  • Published : 2022.09.28

Abstract

The Covid-19 pandemic has had a global impact on education, resulting in the near-complete closure of schools, early childhood education and care (ECEC) facilities, universities, and colleges. To support social distancing in the educational system during the pandemic, this paper presents the creation of a simple 3D virtual tour. The web cyber tour is programmed in the JavaScript programming language, and its purpose is to give students and staff remote access to the university infrastructure during this difficult moment of the pandemic. A drone and a Matterport camera are the two devices used in the realization of this website tour. As a result, through this website users can view a 3D model of the university building (captured by drone) as well as a real-time tour of its interior (captured by Matterport), both uploaded for real-time display. Since users can explore the 3D model of the university infrastructure from every angle at a distance through the website, the tour reduces the risk of Covid-19 infection at the university. It also provides students who cannot be present on-site with detailed information about the campus.

1. Introduction

The Covid-19 pandemic has had a global impact on education, resulting in the near-complete closure of schools, early childhood education and care (ECEC) facilities, universities, and colleges. To combat the spread of Covid-19, most countries agreed to temporarily close educational institutions [1]. Since the 12th of January 2021, over 825 million students have been affected by school closures in reaction to the epidemic.

Among the most important challenges created by COVID-19 is how to adapt a system of education built around physical schools. At its peak, more than 188 countries, encompassing around 91% of enrolled learners worldwide, closed their schools to try to contain the spread of the virus. School closures have a very real impact on all students, but especially on the most vulnerable ones, who are more likely to face additional barriers. In reaction to school closures, UNESCO advocated the adoption of distance learning programs as well as open educational tools and platforms that schools and teachers can use to reach students remotely and minimize disruption to education. In this work, we propose to create a web cyber tour to assist schools in reducing the risk of infection [2]. Recently, 360° panorama technologies have been used to create immersive videos and pictures of real scenes [3-5]. These technologies capture a circular fisheye view of the environment and are cheaper and easier to use, since no specific technical preparation is required [6,7]. Matterport is a new 3D virtual platform that can play an important role in helping schools reopen securely as soon as local restrictions are lifted.

Extended Reality technologies have already been successfully applied as methodological tools in other scientific disciplines, such as neuroscience [8], psychology [9], education [10], medicine [11], and human resources [12]. It is therefore not surprising that marketing researchers are exhibiting interest in virtual reality (VR) as a new e-commerce marketing channel with tremendous interaction capacity and completely novel contents that, up until this point, have been unavailable to marketing scholars and industry.

The aim here is to build a simple cyber web tour with two click-button functions: one to view the 3D interior of the school with all of its detailed information, and the other to view the 3D exterior of the school, which would be helpful for education during this pandemic. This web cyber tour is programmed in the JavaScript programming language. Its purpose is to give students and staff access to the university infrastructure from a distance during this difficult moment of the pandemic.

Inje University was chosen for this experiment because it is the university at which we study and to which we could easily gain access. The work consists of three parts, as follows:

● The drone (for the development of the exterior 3D view of the school).

● The Matterport (for the development of the interior 3D view of the school).

● The web cyber tour (the platform that users will use to view the above-mentioned 3D views).

2. Theoretical background

2.1 The impact of Covid-19 toward education

The World Health Organization (WHO) confirmed that this outbreak afflicted people all across the world [13]. Everything in people's lives has changed as a result of the pandemic that is currently sweeping the globe. The pandemic is wreaking havoc on all educational sectors: many people have lost their employment, and some pupils are unable to attend school.

Because pupils cannot have face-to-face classes under the new norms of life and movement control orders, education has been hampered. This situation has a negative impact on education and institutions, as well as on student achievement; the lack of student-teacher interaction has also raised concerns about the integrity of students' work. Many institutions turned to commercial agencies to handle exam proctoring [14], but worries about student privacy and student mental health arose almost immediately [15].

2.2 The potential of 360° photography and virtual tours

Nowadays, the advancement of photography technology is impressive. Much of the relevant software and hardware was created concurrently with the development of this technology, and numerous imaging techniques have been developed. The term "virtual reality" (VR) was coined in the 1960s. Virtual reality photography (VRP), or 360° photography, is a photographic technique that captures a 360° image with a borderless and seamless quality [16].

360° photographic images can be used on the web to introduce or market an area, place, or space. Commercializing photography through an online virtual environment allows marketers to create an interactive environment for viewers: with 360° pictures, viewers can look in all directions and move between locations in space, as in a virtual tour. A virtual tour is a website that allows people to view a location virtually. Virtual reality is "a computer-generated three-dimensional simulation or image that appears real, or a physical environment explored by a person using special electronic equipment." A digital tour can build a virtual tour out of a sequence of videos or still photographs in a variety of ways: not computer-generated images or films as in virtual reality games, but real-world visuals of places that people cannot or do not intend to visit. 360° videos are also common within the 360° industry; the basic principle is the same as with 360° photography [17].

Matterport captures spaces and creates truly interactive 3D models of them. It is the best option for a 3D virtual tour because of the 3D features it produces automatically: the floor plan and the dollhouse view, which are explained in more detail further on. Both the drone and Matterport generate point cloud data, so technically the two data sets can be meshed to produce an "outer skin" over the Matterport dollhouse. The Matterport camera is a tripod-mounted 3D camera that uses PrimeSense 3D mapping technology, the same technology used, for example, in the Kinect motion-sensing input device of Xbox gaming consoles. The camera system rotates in place through 360° while scanning and transfers the data via Wi-Fi to the Capture app on an iPad in 30 seconds. The distance between scanning locations is usually within one to three meters, and the Capture app stitches the transferred scans together automatically. Captured spaces are uploaded to Matterport's cloud service for more complete and detailed post-processing [18].

Matterport is much more than a panoramic scan; it allows users to capture and connect areas to create completely interactive spatial 3D models. Viewers can see what they desire rather than what the production firm has put together when they use such 3D virtual tour services. Instead of flying over the top of a roof and into a doorway or window, you fly over the same point of the roof, and the person watching can now look directly down, getting a sense of the floor plan, space, staircases, room placement, and a variety of other things that would be missed during a drone shoot.

Many cyber web tours already exist, but most of them are panoramic images that can be viewed from only one angle, the front or the top. How these cyber web tours can be modified so that users can view the images from different angles, and how such a web cyber tour can help during the pandemic, is what is developed in this work.

3. Content of the design development

This project was built in six months during an internship at the company specified in the 3D mapping section. It was carried out by three people in total: two drone pilots and one software programmer. The methods used to build the web project are explained step by step in this section.

Two tools, a drone and a Matterport camera, are utilized in order to realize the website tour. The process consists of three technical stages. The first stage is using a drone to build the exterior 3D model of the university building. The next stage is utilizing the Matterport to construct the interior 3D model of the university building. The last stage is to link both of them in the website (prototype). The particular software used is Bentley ContextCapture, which enables the creation of 3D models of the infrastructure; the web cyber tour is programmed in the JavaScript programming language.

Figure 1. Flowchart of the design process of creating a model of a cyber tour

3.1 Stage 1: using a drone to build the exterior 3D model of the university building

The goal here is to create the exterior 3D view of the university with the help of the drone as a tool (Figure 2).

Figure 2. The drone in its off mode

The flight mission must be configured before launching the drone. Before starting to take images, it is important to determine and set certain parameters in the mission planner application [19]-[20]. The main hardware used in this study includes a DJI Phantom 4 drone and a personal computer (PC) for data processing. The DJI Phantom 4 has the advantage of covering a wider range of areas with adequate power support. The drone also carries a high-quality camera with a gimbal for image stability, so it is suitable for use in mapping activities. The minimum requirements for the PC are at least 12 GB of RAM; an 8th-generation i5/i7 CPU, an equivalent AMD Ryzen processor, or higher; 20 GB of internal hard drive space; desktop-class onboard graphics with 2-4 GB of dedicated VRAM; and a 15-inch or larger 1920 × 1080 (FHD) monitor. The software used is Bentley ContextCapture ver. 2.93. ContextCapture makes it possible to use ordinary photographs to cost-effectively produce 3D models of the most challenging existing conditions for any infrastructure project. For additional accuracy, adding point clouds from laser scans results in fine details, sharp edges, and geometric precision.

Figure 3. Drone method flowchart

Figure 4. Transformation area selections

The red circle points shown in Figure 4 (a) represent the points at which the camera images were taken (camera points). When the flight is completed, the pictures are collected and processed in the Bentley program. This Bentley application is what enables ordinary photographs to be used to cost-effectively produce 3D models. The 3MX format was used for the 3D mesh in this work. It is an open format intended to facilitate the distribution of ContextCapture data: it can be used for web publishing via the free ContextCapture Web Viewer, for publishing or embedding 3D models in a website, and for interoperability with other Bentley Systems products, such as the ContextCapture Web Viewer and MicroStation. This 3D model view is the end result that will help students get an idea of how the school building looks.

Figure 5. Results of the 3D building View: (a) Top view of the building; (b) Side view of the building.

3.2 Stage 2: utilizing the Matterport to construct the interior 3D model of the university building

Figure 6. Matterport device

The goal here is to create the interior 3D view of the university with the help of the Matterport as a tool. This section presents the Matterport as the latest and most popular web-based online software for virtual tours.

Figure 7. Matterport method flowchart

The Matterport Pro2 was the device used for this experiment. For the specification, it has a maximum operating distance of 4.5 m. The important parameter is the resolution, which for this camera equals 10 points per degree, or 3600 points at the equator and 1800 points along a meridian, for a total of about 4 million points per panorama. A typical field of view equals 360° (horizontal) × 300° (vertical). A more detailed description can be found in [21]-[22].
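The quoted figures are mutually consistent and can be cross-checked with a short calculation: at 10 points per degree, a full 360° circle yields 3600 samples and a 180° meridian arc yields 1800, while the full sphere (about 41,253 square degrees) carries roughly 4.1 million points, matching the "4 million points per pano" figure. A sketch of that arithmetic:

```javascript
// Cross-check of the Pro2 angular-resolution figures quoted above.
const pointsPerDegree = 10;

// A full 360° circle at the equator and a 180° meridian arc.
const equatorPoints = 360 * pointsPerDegree;   // 3600
const meridianPoints = 180 * pointsPerDegree;  // 1800

// Area of the full sphere in square degrees: 4*pi steradians,
// converted with (180/pi)^2 square degrees per steradian.
const sphereSquareDegrees = 4 * Math.PI * (180 / Math.PI) ** 2; // ≈ 41,253

// Density is (points per degree)^2 points per square degree.
const pointsPerPano = sphereSquareDegrees * pointsPerDegree ** 2;

console.log(equatorPoints, meridianPoints, Math.round(pointsPerPano)); // 3600 1800 4125296
```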

For photography, it has an output of 134.2 MP per panorama; an equirectangular export format with images up to 8092 px × 4552 px; a 4K full-glass lens; automatic full-model white balancing; and a 360° (horizontal) × 300° (vertical) field of view.

For data distribution it uses Wi-Fi (802.11 n/ac, 5 GHz) to transfer data from the camera to an iOS device through the Capture app. The battery can scan for 8 hours on one charge, with a 4.5-hour charge time, and a GPS is included. Before using the device, the Matterport has to be set up and calibrated. After calibration, the Matterport has to be moved and positioned in different areas of the building so that all angles of the building can be captured by the 360° camera.

Figure 8. Camera point position

When all the pictures were taken, they had to be uploaded into the cloud. The cloud here is the service provided by the Matterport website. This cloud service joins all the pictures to form a complete VR (virtual reality) view. Once the Matterport cloud creates your digital twin, you can edit your space in Matterport Workshop to customize it, add additional details, and share it. The cloud creates the 3D model, which is the interior view. The end results obtained with Matterport are described step by step below.

Figure 9. The result image of the virtual tour in web page: (a) The main image of the web page; (b) Menu tool found on the bottom left corner of the (a).

One click on a white circle on the floor in (a) lets the user move around the building; to navigate through the building, follow the white circles by clicking on them. The menu bar (b), found in the bottom left corner of (a), contains important tools for this application.

Table 1. This table describes the tools incorporated in (b) above.

4. Discussion

The website in Figure 10 displays both of the 3D models; that is, the end results of the drone and the Matterport are viewed on this website. HTML and JavaScript were used to create this web cyber tour. HTML, which stands for Hypertext Markup Language, is the predominant markup language for web pages and a building block of the web. A web browser reads HTML documents and composes them into visual or audio web pages. Here, HTML was used to create the interface design, that is, the buttons and the frame. JavaScript is an HTML-compatible language that permits lines of code to be pasted directly into web pages. JavaScript is not the same as Java (i.e., applets). Java applets are programs that depend on HTML for their distribution but that have been designed and precompiled using specific software (e.g., the JDK [23]; an introduction to Java can be found in [24]). Java is a complete programming language that can be used for any web-based application, including online experiments, lab demonstrations, and real-time simulations. JavaScript is not as powerful as Java.

Figure 10. The website display interface screen

What JavaScript essentially provides is the possibility of designing documents that update automatically (e.g., the color of the background, the content of a form) or that react to user actions (e.g., mouse movement, button clicks). Because of this, in this web tour the user can move the mouse to see the 360° view of the various parts of the school, including its entire interior, which is an important aspect for the users.
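The kind of pointer handling involved can be sketched as below. This is an illustrative example rather than the paper's actual viewer code: the function name, the sensitivity constant, and the yaw/pitch representation are all assumptions, but the wrap-around of the horizontal angle and the clamping of the vertical angle are the standard logic behind mouse-driven 360° navigation.

```javascript
// Illustrative sketch (not the project's actual code): converting mouse
// drags into yaw/pitch angles for a 360° panorama viewer.
const SENSITIVITY = 0.25; // degrees of rotation per pixel dragged (assumed value)

function updateView(view, dx, dy) {
  // Yaw wraps around the full 360° horizontal range.
  const yaw = ((view.yaw + dx * SENSITIVITY) % 360 + 360) % 360;
  // Pitch is clamped so the camera cannot flip over the poles.
  const pitch = Math.max(-90, Math.min(90, view.pitch - dy * SENSITIVITY));
  return { yaw, pitch };
}

// Example: dragging 400 px to the right and 100 px upward.
let view = { yaw: 0, pitch: 0 };
view = updateView(view, 400, -100);
console.log(view); // { yaw: 100, pitch: 25 }
```

In a browser, `updateView` would be called from `mousemove` events while a button is held, and the resulting angles fed to the panorama renderer.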

This website contains two buttons, as shown in Figure 10 above. With one click on the green button the user can access the interior 3D view of the school, and with a click on the blue button the user can access the exterior 3D view of the school. The website is simple and user-friendly.
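A minimal version of the two-button wiring can be sketched as follows. The `my.matterport.com/show/?m=<model id>` pattern is Matterport's standard iframe embed form, but the model ID, the exterior-model URL, and the element IDs here are placeholders, not the project's real values.

```javascript
// Minimal sketch of the two-button viewer (model IDs and URLs are placeholders).
// Each button swaps the iframe source between the interior Matterport tour
// and the exterior 3D model published from ContextCapture.
const VIEWS = {
  interior: 'https://my.matterport.com/show/?m=EXAMPLE_MODEL_ID',
  exterior: 'https://example.com/contextcapture/exterior-model/index.html',
};

function viewUrlFor(name) {
  if (!(name in VIEWS)) throw new Error('unknown view: ' + name);
  return VIEWS[name];
}

// DOM wiring (runs only in a browser; skipped under Node.js).
if (typeof document !== 'undefined') {
  const frame = document.getElementById('tour-frame');
  for (const name of Object.keys(VIEWS)) {
    document.getElementById(name + '-button')
      .addEventListener('click', () => { frame.src = viewUrlFor(name); });
  }
}
```

The corresponding HTML would contain the two buttons and a single `<iframe id="tour-frame">` into which either view is loaded.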

The major difference between 2D and 3D modeling is that 3D modeling adds a third dimension. This means that 3D models contain more information than 2D models: they represent the finished site as it will look in real life. 2D models, on the other hand, provide valuable information, but viewers are left to imagine what the final product will look like. 3D models can contain a wide range of information types and can be used for grading, site layout, and other purposes, in addition to the uses of 2D modeling. Adopting 3D modeling in this cyber web tour is the best way to modify it so that users can view the images from different angles, and it also helps during the pandemic.

5. Conclusions

The aim of this work was to build a simple cyber web tour with two click-button functions: one to view the 3D interior of the school with all of its detailed information, and the other to view the 3D exterior of the school, which would be helpful for education during this pandemic. The purpose of this web cyber tour is to give students and staff access to the university infrastructure from a distance during this difficult moment of the pandemic.

Many cyber web tours already existed, but most of them were panoramic images that could be viewed from only one angle, usually the top. How to modify the cyber web tour to give users a better 3D model, through which they could explore the infrastructure with all of its detailed information from a distance, is what this work developed.

We found that adopting 3D modeling in this cyber web tour was the best way to modify it so that users can view the images from different angles, and it also helps during the pandemic. Education can now be kept safe during the pandemic because users can explore their school environment from a distance with a view from every angle. This remote access reduces the risk of Covid-19 infection, because the main prevention measure against this virus is social distancing.

For the Matterport experiment to be carried out well, people have to stay away; otherwise they are included in the 3D model. For this reason, it was challenging to carry out the experiment quickly, because we had to stop people from passing by during the experiment. Much time also had to be invested, because the Matterport device had to be placed at all the different angles of the building to take the many photos needed to develop a good 3D model. This paper proposed a workflow to construct the 3D model of the university building and its interior real-time tour, and subsequently upload the model for real-time display on a website.

For future work, we will try to link this result to a Google application. This link will allow users to enter the whole design with one click, and they will be able to access it directly from Google Earth. We recommend that the educational system fully utilize the Matterport digital platform and the drone as a new approach in the study system, given the advanced technology of 3D virtual tours. The Matterport helps students get an overview through a hyper-real 3D virtual tour experience, with detailed information for each selected item.

Acknowledgments:

I would like to thank my supervisor Soo-jin Park for her consistent support and guidance throughout this project. I also thank the CEO Kim Rihwan and his team for their collaborative effort during data collection in this research project. I would also like to acknowledge Prof. Seo Young Kim, who helped me to organize this work.

Conflicts of Interest:

The authors declare no conflict of interest.

References

  1. U. E. Samuel, I. P. Abner, V. Inim, and A. E. Jack, "SARS-CoV-2 Pandemic on the Nigerian Educational System," International Journal of Management, vol. 11, no. 10, pp. 626-635, 2020, [Online]. Available: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3738925.
  2. P. Sun, X. Lu, C. Xu, W. Sun, and B. Pan, "Understanding of COVID-19 based on current evidence," J. Med. Virol., vol. 92, no. 6, pp. 548-551, 2020, doi: 10.1002/jmv.25722.
  3. E. Brivio, S. Serino, E. Negro Cousa, A. Zini, G. Riva, and G. De Leo, "Virtual reality and 360° panorama technology: a media comparison to study changes in sense of presence, anxiety, and positive emotions," Virtual Real., vol. 25, no. 2, pp. 303-311, 2021, doi: 10.1007/s10055-020-00453-7.
  4. B. Shuai, Z. Zuo, B. Wang, and G. Wang, "Scene Segmentation with DAG-Recurrent Neural Networks," IEEE Trans. Pattern Anal. Mach. Intell, vol. 40, no. 6, pp. 1480-1493, 2018, doi: 10.1109/TPAMI.2017.2712691.
  5. J. Alvarez, R. Rodriguez, and M. Martinez, "Work in progress - Use of immersive videos of virtual reality in education," 2020, doi: 10.1109/EDUNINE48860.2020.9149498.
  6. M. Irene, T. Zara, G. M. M. Ana, B. Filipa, V. Helena, A. V. Pedro, and C. M. Joao, "Evidence for the long-term contribution of upstream agricultural practices and land use on the biodiversity and function of an estuary and adjacent coastal areas," Front. Mar. Sci., 2016, doi: 10.3389/conf.fmars.2016.04.00103.
  7. L. Vinet and A. Zhedanov, "A 'missing' family of classical orthogonal polynomials," J. Phys. A Math. Theor., vol. 44, no. 8, 2011, doi: 10.1088/1751-8113/44/8/085201.
  8. J. Fox, D. Arena, and J. N. Bailenson, "Virtual Reality: A Survival Guide for the Social Scientist," J. Media Psychol., 2009, doi: 10.1027/1864-1105.21.3.95.
  9. W. P. Teo, M. Muthalib, S. Yamin, A. Hendy, K. Bramstedt, E. Kotsopoulos, S. Perrey, and H. Ayaz, "Does a combination of virtual reality, neuromodulation and neuroimaging provide a comprehensive platform for neurorehabilitation? - A narrative review of the literature," Frontiers in Human Neuroscience. 2016, doi: 10.3389/fnhum.2016.00284.
  10. J. T. Bruer, "Building bridges in neuroeducation," in The Educated Brain: Essays in Neuroeducation, 2008.
  11. I. A. C. Giglioli, G. Pravettoni, D. L. S. Martin, E. Parra, and M. A. Raya, "A novel integrating virtual reality approach for the assessment of the attachment behavioral system," Front. Psychol., 2017, doi: 10.3389/fpsyg.2017.00959.
  12. M. Alcaniz, E. Parra, and I. A. C. Giglioli, "Virtual reality as an emerging methodology for leadership assessment and training," Front. Psychol., 2018, doi: 10.3389/fpsyg.2018.01658.
  13. S. Bhandari, A. S. Shaktawat, B. Patel, A. Dube, S. Kakkar, A. Tak, J. Gupta, and G. Rankawat, "The sequel to COVID-19: the antithesis to life," J. Ideas Heal., vol. 3, no. Special1, pp. 205-212, 2020, doi: 10.47108/jidhealth.vol3.issspecial1.69.
  14. D. Ramachandran, D. Subramanian, and I. John Kisoka, "Effectiveness of Internal Audit in Tanzanian Commercial Banks," Br. J. Arts Soc. Sci., vol. 8, no. I, pp. 2046-9578, 2012, [Online]. Available: http://www.bjournal.co.uk/BJASS.aspx.
  15. C. G. Vidal, G. Sanjuan, E. M. Garcia, P. P. Alcalde, N. G. Pouton, M. Chumbita, and C. Pitart, et al., "Incidence of co-infections and superinfections in hospitalized patients with COVID-19: a retrospective cohort study," Clin. Microbiol. Infect., vol. 27, no. 1, pp. 83-88, 2020, doi: 10.1016/j.cmi.2020.07.041.
  16. C. James, "International Internship Opportunities for Emerging Design Professionals," in Inted2012: International Technology, Education and Development Conference, 2012, pp. 4894-4901.
  17. J.-H. Lee, "The new explore of the animated content using OculusVR - Focusing on the VR platform and killer content -," Cartoon Animat. Stud., vol. 45, pp. 197-214, 2016, doi: 10.7230/koscas.2016.45.197.
  18. V. V. Lehtola, H. Kaartinen, A. Nuchter, R. Kaijaluoto, A. Kukko, P. Litkey, E. Honkavaara, T. Rosnell, M. T. Vaaja, J. P. Virtanen, M. Kukela, A. E. Lssaoui, L. Zhu, A. Jaakkola, and J., "Comparison of the selected state-of-the-art 3D indoor scanning and point cloud generation methods," Remote Sens., vol. 9, no. 8, 2017, doi: 10.3390/rs9080796.
  19. I. Cruz-Aceves, J. G. Avina-Cervantes, J. M. Lopez-Hernandez, and S. E. Gonzalez-Reyna, "Multiple active contours driven by particle swarm optimization for cardiac medical image segmentation," Comput. Math. Methods Med., vol. 2013, 2013, doi: 10.1155/2013/132953.
  20. M. Mangiameli and G. Mussumeci, "Gis approach for preventive evaluation of roads loss of efficiency in hydrogeological emergencies," Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. - ISPRS Arch., vol. 40, no. 5W3, pp. 79-87, 2013, doi: 10.5194/isprsarchives-XL-5-W3-79-2013.
  21. M. Pulcrano, S. Scandurra, G. Minin, and A. Di Luggo, "3D CAMERAS ACQUISITIONS for the DOCUMENTATION of CULTURAL HERITAGE," in ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences, 2019, vol. 42, no. 2/W9, pp. 639-646, doi: 10.5194/isprs-archives-XLII-2-W9-639-2019.
  22. E. Lachat, H. Macher, T. Landes, and P. Grussenmeyer, "Assessment of the accuracy of 3D models obtained with DSLR camera and Kinect v2," in Videometrics, Range Imaging, and Applications XIII, 2015, vol. 9528, p. 95280G, doi: 10.1117/12.2184866.
  23. S. F. Husaini, "Using the Java Native Interface," XRDS Crossroads, ACM Mag. Students, vol. 4, no. 2, pp. 18-23, 1997, doi: 10.1145/332100.332105.
  24. N. E. Briggs and C. F. Sheu, "Using Java in introductory statistics," Behav. Res. Methods, Instruments, Comput., vol. 30, no. 2, pp. 246-249, 1998, doi: 10.3758/BF03200651.