
Study of Scene Directing with Cinemachine

  • Park, Sung-Suk (Department of Imaging, Baekseok Arts University) ;
  • Kim, Jae-Ho (Department of Imaging, Baekseok Arts University)
  • Received : 2021.08.24
  • Accepted : 2022.02.10
  • Published : 2022.03.28

Abstract

With Unity, footage can be created using 3D motion, 2D motion, particles, and sound, and post-production video editing is possible by combining that footage. In particular, Cinemachine, a suite of camera tools for Unity that greatly affects screen layout and the flow of video images, can implement most of the functions of a physical camera, and visual aesthetics can be achieved through it. However, because Cinemachine is part of a game engine, an understanding of the game engine must come first, and doubts may arise as to how similar it is to a physical camera. Accordingly, the purpose of this study is to examine the advantages and cautions of virtual cameras in Cinemachine and to explore their potential for development by implementing storytelling directly.

Keywords

1. Introduction

Visualization of a scene means expressing a certain scene as a single shot or a sequence of shots. Therefore, visualization on a TV or movie screen can be seen as the figuration of a scene as a film image together with sound [1]. This requires high-spec hardware and optimized production pipelines and workflows, which means that a great deal of time and equipment are needed to make a film. It is also difficult to edit and add images afterwards. These complex processes need to be efficiently modified and simplified.

One way to do this is to use game production technology to create film images with real-time rendering [2]. Cinemachine in Unity allows you to work while checking animations, lights, cameras, and sound in real time [3]. This production method reduces production steps, makes modifications easier, and allows a test scene to be used as the final work. Unity won a Technology and Engineering Emmy® Award for excellence in engineering creativity with the broadcast-quality short film “Baymax Dreams” [4]. This study intends to find a way to realize video scenes using the latent power of Cinemachine, which supports various virtual cameras in virtual space [5].

2. Materials and Methods

When writer and director Neill Blomkamp and VFX director Chris Harvey were making "Adam 2: The Mirror," the midday scene was not lit as expected. Together with the lighting team, they created the 30-second scene at multiple times of day, including 2 o'clock and 4 o'clock. They were able to create the additional scenes in a short amount of time and could choose the most appropriate one, because the scenes could be compared while rendering in real time with Cinemachine [6] in Unity [7].

In order to produce a scene in Cinemachine, the characteristics of virtual cameras should be understood first. For the virtual camera, the functions of a physical camera were taken as a reference, and for the scene, a movie was taken as a reference, in order to understand and test how the camera is used in the game engine. On this basis, this study examines the advantages and cautions of Cinemachine through a case production, checking how well the storyboard is satisfied when it is reproduced in Cinemachine, and organizing the workflow of scenes for directing.

3. Footage Scene Production

3.1 How to Use Cinemachine

3.1.1 Directing a scene with Cinemachine

The Fast and the Furious, a series running from the first film in 2001 to the ninth film in 2021, has been loved for the impressive and dynamic car action scenes in each installment. In order to implement such a car chase scene in Cinemachine, dynamic camera work is required.

One thing to note before starting Cinemachine is that only one Cinemachine Brain is allowed per scene in Unity. The scenes taken by the multiple virtual cameras are actually transmitted to the Cinemachine Brain, which finally shows the received images. An interesting feature is that the Pan and Tilt techniques of a camera can be implemented automatically by assigning the object to be followed (the target) to the camera's 'Look At' and 'Follow' properties [8].
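
For reference, the same targeting can also be set from a C# script instead of the Inspector. The following is a minimal sketch, assuming a Cinemachine 2.6-style API; the component references and the "heroCar" target are hypothetical names used only for illustration.

    using Cinemachine;
    using UnityEngine;

    // Minimal sketch (Cinemachine 2.6.x assumed): one CinemachineBrain sits on the
    // Main Camera, and each CinemachineVirtualCamera only feeds it a shot.
    public class ChaseCameraSetup : MonoBehaviour
    {
        public CinemachineVirtualCamera vcam;   // assigned in the Inspector
        public Transform heroCar;               // hypothetical object to be followed

        void Start()
        {
            // Assigning the target enables automatic Pan/Tilt (Aim) and movement (Body).
            vcam.Follow = heroCar;
            vcam.LookAt = heroCar;
        }
    }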

The three actors in the reproduced scene have different camera angles and viewpoints depending on their personalities. That is, after setting one actor as the target of 'Look At' and 'Follow', the values of 'Aim' and 'Body' must be specified. 'Aim' specifies how to look at the target set by 'Look At'. With 'Composer', the most common method, it can be confirmed that the target's pivot comes to the center of the screen. Therefore, the screen layout can be changed by modifying the 'Tracked Object Offset' value. Also, when the target's pivot moves out of the camera's framing, the movement can be controlled with the 'Dead Zone' or 'Soft Zone' of 'Aim' (refer to (a) in Fig. 1). 'Body' determines how the camera moves along with the 'Follow' target. '3rd Person Follow' lets the viewer look at the target from a third-person perspective, allowing the viewer to become an observer in the film and focus on the scene (refer to (b) in Fig. 1).

Figure 1. (a) Directed the Fast and The Furious 8 (2017) scene with Aim and Body attributes. (b) 3rd Person Follow for the Fast and The Furious 8 (2017) zoom scene.
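
The framing described for 'Aim' can likewise be adjusted in code. The sketch below is illustrative only: it assumes the virtual camera already uses Composer as its Aim component, and the offset and zone values are arbitrary examples, not values from the production.

    using Cinemachine;
    using UnityEngine;

    public class ActorFraming : MonoBehaviour
    {
        public CinemachineVirtualCamera vcam;

        void Start()
        {
            // With 'Aim' set to Composer: shift the actor away from the screen center
            // and widen the Dead Zone so that small movements do not move the camera.
            var composer = vcam.GetCinemachineComponent<CinemachineComposer>();
            composer.m_TrackedObjectOffset = new Vector3(0f, 1.5f, 0f); // illustrative offset
            composer.m_DeadZoneWidth  = 0.2f;
            composer.m_DeadZoneHeight = 0.2f;
            composer.m_SoftZoneWidth  = 0.6f;
            composer.m_SoftZoneHeight = 0.6f;
        }
    }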

3.1.2 Camera Blending

To create a video in Unity, the Timeline and the Unity Recorder are used together. In particular, the Timeline enables non-linear editing by showing the activation and movement of various assets as tracks. Blending in the Timeline for a screen-transition effect shows the changed value of the target's movement, not a change of the video image itself. When a walking animation track is blended with a running animation track, the effect of a transition from walking to running is obtained (refer to (a) in Fig. 2).

Blending of virtual cameras, by contrast, changes the movement between cameras. Easing can be applied to the blend, and the path from one camera to another can be set as a straight line or a curve depending on the camera's 'Transitions' setting (refer to (b) in Fig. 2).

Figure 2. (a) Blending conversion speed (b) Blending Path Change
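
For reference, the default blend between virtual cameras can also be configured on the Cinemachine Brain from a script. The sketch below, assuming a Cinemachine 2.6-style API, sets a two-second eased blend; the style and duration are illustrative.

    using Cinemachine;
    using UnityEngine;

    public class BlendSetup : MonoBehaviour
    {
        void Start()
        {
            // The Cinemachine Brain on the Main Camera decides how active virtual
            // cameras are blended when priorities or Timeline shots change.
            var brain = Camera.main.GetComponent<CinemachineBrain>();
            brain.m_DefaultBlend = new CinemachineBlendDefinition(
                CinemachineBlendDefinition.Style.EaseInOut, 2f); // 2-second eased blend
        }
    }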

3.1.3 Other Virtual Cameras

A tracking shot can be taken with a Dolly Camera with Track. Waypoints can be added to the track rail to extend it or to bend it smoothly like a spline. The Dolly Camera moves along the track rail when the waypoint number is entered in Path Position. Track rails can be placed without restriction in the virtual space to produce various scenes (refer to (a) in Fig. 3).

Figure 3. (a) Directed the Fast and The Furious 8 (2017) scene with Dolly Camera with Track. (b) Directed the Fast and The Furious 8 (2017) scene with FreeLook Camera.
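
The dolly setup above can also be sketched in code. The example assumes a Cinemachine 2.6-style API with a CinemachineSmoothPath as the track; the waypoint positions are illustrative values, not those of the produced scene.

    using Cinemachine;
    using UnityEngine;

    public class DollySetup : MonoBehaviour
    {
        public CinemachineVirtualCamera vcam;
        public CinemachineSmoothPath track;   // the Dolly Track (spline of waypoints)

        void Start()
        {
            // Waypoints define the rail; a smooth path bends like a spline between them.
            track.m_Waypoints = new CinemachineSmoothPath.Waypoint[]
            {
                new CinemachineSmoothPath.Waypoint { position = new Vector3(0f, 10f, -20f) },
                new CinemachineSmoothPath.Waypoint { position = new Vector3(5f,  5f,   0f) },
                new CinemachineSmoothPath.Waypoint { position = new Vector3(0f,  2f,  20f) }
            };

            // The Tracked Dolly Body moves the camera along the rail; Path Position
            // selects where on the rail the camera sits (here, the first waypoint).
            var dolly = vcam.GetCinemachineComponent<CinemachineTrackedDolly>();
            dolly.m_Path = track;
            dolly.m_PathPosition = 0f;
        }
    }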

A FreeLook Camera can be used to create a scene with crane and jib movements. The FreeLook Camera provides three rigs: Bottom Rig, Middle Rig, and Top Rig. Look-down and look-up shots with crane-up and crane-down movement are possible by rotating the camera on each rig or moving it back and forth between the rigs (refer to (b) in Fig. 3).
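
As a hedged illustration of this rig setup, the sketch below configures the three orbits of a FreeLook Camera and moves its Y axis to imitate a crane-up and crane-down; the rig heights, radii, and speed are arbitrary example values, and "heroCar" is a hypothetical target.

    using Cinemachine;
    using UnityEngine;

    public class CraneLikeFreeLook : MonoBehaviour
    {
        public CinemachineFreeLook freeLook;
        public Transform heroCar;   // hypothetical target

        void Start()
        {
            freeLook.Follow = heroCar;
            freeLook.LookAt = heroCar;

            // Top, Middle, and Bottom rigs: the height and radius of each orbit.
            freeLook.m_Orbits = new CinemachineFreeLook.Orbit[]
            {
                new CinemachineFreeLook.Orbit(8f, 4f),    // Top Rig (look down)
                new CinemachineFreeLook.Orbit(3f, 6f),    // Middle Rig
                new CinemachineFreeLook.Orbit(0.5f, 4f)   // Bottom Rig (look up)
            };
        }

        void Update()
        {
            // Moving the Y axis between 0 (bottom) and 1 (top) acts like a crane up/down.
            freeLook.m_YAxis.Value = Mathf.PingPong(Time.time * 0.1f, 1f);
        }
    }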

3.2 Production With Cinemachine

3.2.1 Storyboard

Table 1 shows the storyboard for shooting car action while moving the camera in virtual space.

Table 1. Storyboard for Studying Video Scenes

3.2.2 Virtual Camera

Table 2. Virtual camera settings used for the production

“Camera1” created a scene with a bird's-eye view and a zoom. It was produced by moving down the track from top to bottom, as if shooting with a drone, and a dynamic scene was created by changing the Path Position and the FOV. The FOV of “Camera2” was the same as that of “Camera1”. For a dramatic angle, Lock To Target With World Up was selected as the Body's Binding Mode so that the camera does not tilt and roll together with the target. “Camera3” looks at the car from the front, but a crane-up and look-down effect appears because of the positions of the front and rear cameras. “Camera4” set the Body to 3rd Person Follow to create a “helicopter” point of view. “Camera5” looks at the car accident from a bird's-eye view.
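
The kinds of settings listed above can also be applied from a script. The sketch below, assuming a Cinemachine 2.6-style API and an illustrative FOV value, changes a virtual camera's lens FOV and sets the Body's Binding Mode to Lock To Target With World Up, as described for “Camera2”.

    using Cinemachine;
    using UnityEngine;

    public class CameraTwoSetup : MonoBehaviour
    {
        public CinemachineVirtualCamera vcam;   // e.g. the "Camera2" virtual camera

        void Start()
        {
            // FOV is part of the virtual camera's lens settings.
            vcam.m_Lens.FieldOfView = 40f;   // illustrative value

            // Bind the Transposer to the target with world up so the camera
            // does not tilt and roll together with the target.
            var body = vcam.GetCinemachineComponent<CinemachineTransposer>();
            body.m_BindingMode = CinemachineTransposer.BindingMode.LockToTargetWithWorldUp;
        }
    }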

3.3 The Final Product

“Cut5” and “Cut6” were filmed at the same time by adding a separate virtual camera for each (refer to (a) in Fig. 4).

Figure 4. (a) Added Shot Timelines (Cut5, Cut6). (b) Final Timeline for final completion.

With the Final Timeline, a collection of Shot Timelines, other screens could be added at the same time. The animation itself was made to be 30 seconds long, but the result was 1 minute 4 seconds and 12 frames (01:04:12), because the added shots extended the video time. This is the procedure of the production method, which is divided into Shot Timelines and a Final Timeline.

The simplified workflow of Cinemachine can be found in the following pipeline:

Figure 5. Workflow of Cinemachine in the 3D Production Pipeline.

4. Results and Discussion

4.1 Advantages of Cinemachine

In general, when an animation is changed, all subsequent scenes often have to be reworked. In Unity, however, even if an intermediate scene is changed, there is no need to rework the whole; a lot of time can be saved by finding and correcting only the parts to be modified and the parts related to the correction.

Additional cameras can capture the movement of the same character from multiple angles. After composing a separate Timeline for the movement as needed, various camera shots can be tried by importing that Timeline into a Control Track.

Virtual cameras are designed to provide the functions of a physical camera. With Noise, camera shake can create a scene that looks as if it were shot with a handheld camera. By adding the Depth of Field function of Cinemachine Volume Settings to Extensions, visual beauty can be enhanced by adjusting the depth of field. The technology required for video filming continues to develop along with game engines, and advanced techniques can be applied immediately without large expense.
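
As a small illustration of the Noise function, the sketch below assumes the virtual camera already has a Basic Multi Channel Perlin noise component with a noise profile assigned in the Inspector; the amplitude and frequency gains are arbitrary example values.

    using Cinemachine;
    using UnityEngine;

    public class HandheldShake : MonoBehaviour
    {
        public CinemachineVirtualCamera vcam;

        void Start()
        {
            // Noise (Basic Multi Channel Perlin) shakes the camera so the shot
            // looks handheld; the noise profile itself is set in the Inspector.
            var noise = vcam.GetCinemachineComponent<CinemachineBasicMultiChannelPerlin>();
            noise.m_AmplitudeGain = 0.8f;   // illustrative strength of the shake
            noise.m_FrequencyGain = 1.5f;   // illustrative speed of the shake
        }
    }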

4.2 Cautions of Cinemachine

Rather than arranging various virtual cameras intuitively, given their complex settings, they should be arranged according to a plan made in advance, and the 'Look At' and 'Follow' targets should be decided first in that plan. 'Look At' and 'Follow' are functions implemented to be advantageous for game production, so they do not have to be used in filming. In addition, a virtual camera can be added while looking at the scene in 'Scene View', which is convenient for fixing the camera's filming angle.

When aiming is automated with 'Look At', the camera's viewpoint becomes the target's pivot. The pivot position can be changed from the beginning, or an invisible object can be used as the target for the point of view. It is convenient to decide the points of view in advance and to create targets for the different points of view when making the filming plan.

The position of a camera changes relative to the 'Follow' target according to the 'Body' type. With Lock To Target or Orbital Transposer, the position of the camera is also changed, so it is inconvenient to fix it again afterwards. When using 'Follow', whether the camera position needs to be fixed should be determined in advance.

The most important caution is that only one Cinemachine Brain can be added to one scene. Unexpected filming results may be obtained because each virtual camera shoots from a different location, but the final shot is produced by the Cinemachine Brain. Also, there is no problem when filming in the Timeline, but various problems may occur when recording with the Unity Recorder after adding virtual cameras with various functions. With the Unity Recorder, a completely different result may be obtained when an animation track is reused, because the animation continues without breaking; the camera position in the video may differ from what was initially intended, and the camera's Follow Body type may change.

In that case, it is better to record each Shot Timeline as an image sequence with the Recorder and assemble the final version in a video editing tool such as Adobe Premiere, rather than making the final version in the Timeline.

Figure 6. Problems when recording with Unity Recorder

5. Conclusions

With Unity, the high production cost and long production time of a video can be reduced. High video quality is also possible with the physically based lighting and materials of developing game engines. In order to produce video with a developing game engine, this study examined the filming methods of virtual cameras in Cinemachine and confirmed that various kinds of filming are possible. In addition, efficiency could be increased when filming the necessary scenes with the Timeline by separating the production by function and scene and then using it in post-production, in line with the 'separation of concerns (SoC)' that is fundamental to Unity's design. Therefore, when the final work was assembled after composing a Timeline for each animation and short Timelines for the various scenes, it was possible to conveniently edit and add new scenes.

Figure 7. Unity Cinemachine Workflow

One regrettable point, however, is that planned filming can produce somewhat inflexible scenes compared with intuitive film-making, so it is better to use the 'Look At' and 'Follow' of virtual cameras only when necessary. Of course, it is possible to film as if holding a camera in hand by using Unity's Live Capture package and Apple's ARKit, and intuitive film-making through a director's direct filming will become possible once the inconvenience of the connection is resolved. This study examined Cinemachine, which can shoot a virtual space from dynamic angles without real-world constraints and allows various attempts with cameras that can be added without limit. It is judged that the technologically advancing Cinemachine needs to be studied continuously and that ways to apply it should be sought.

References

  1. Herbert Zettl, Applied Media Aesthetics (7th ed.), Cengage Learning Korea Ltd., 2016, p. 251.
  2. Charlie Fink, "Unity To Disrupt Film And TV Production," 2017: https://www.forbes.com/sites/charliefink/2017/12/11/unity-to-disrupt-film-tv-production/?sh=4c03d05c574b
  3. Secrets of real-time film production: https://unity.com/kr/solutions/real-time-filmmaking-explained#how-it-looks
  4. Unity wins its first Technology and Engineering Emmy® Award: https://blog.unity.com/technology/unity-wins-itsfirst-technology-and-engineering-emmy-award
  5. Steven D. Katz, Film Directing: Shot by Shot, Michael Wiese Productions, 1998, p. 16.
  6. Cinemachine (version 2.6.5): https://unity.com/unity/features/editor/art-and-design/cinemachine
  7. Unity homepage: https://unity.com/, Unity version 2020.3.15.f1 (LTS).
  8. Jeremy Vineyard, SETTING UP YOUR SHOTS (2nd ed.), Michael Wiese Productions, 2018, pp. 13-14, 25, 37, 45, 59, 81, 113.
  9. tubane, Unity DESIGN BIBLE, Born Digital, Inc., 2020, pp. 466-494.
  10. Lee Sang Hwa, the Game GRAPHICS: Unity Visual Techniques, Viel Books, 2020, pp. 185-219.
  11. Japanese Otaku City, ZENRIN CO., LTD., Version 1.0, November 06, 2018.
  12. HQ Racing Car Model No.1203, Azenrilo, Version 1.0, February 11, 2019.
  13. Police Car & Helicopter, SICS Games, Version 1.1, December 03, 2016.
  14. Man in a Suit, Studio New Punch, Version 1.1, November 10, 2016.
  15. Pose Editor, Sator Imaging, Version 2.2, September 17, 2020.
  16. Fire Explosion VFX, Core games studio, Version 1.0, November 24, 2015.