Virtual Production
After making a short film and completing the Virtual Production fellowship, I was asked to create a new department tool at Aardman. Chicken Run 2 was starting production and the world needed to be bigger than the studio space allowed, so CG set extensions were required. To let the director visualise these shots, I created a bespoke real-time camera tracking system in Unreal Engine. As this was a new technology for the studio, and knowing it had to run on up to 15 shooting units at a time, I knew that the lower the cost, the more likely it would be adopted.

We used an HTC Vive tracker mounted on the camera, driving an Unreal system that I built entirely from scratch. Stop motion has specific requirements for virtual production that existing systems did not cater for. We shoot on Canon digital stills cameras, so the system had to run on the existing PC setup, using Dragonframe to capture the images. Because of the tight spaces a camera must fit into on a stop-motion set, we often flip the camera upside down so it can get lower to the ground; this also had to be handled in the Unreal Blueprint. Every lens used in production had to be added, in case the director asked to try a different lens during the initial composition of the shot on the floor.

We use an HTC Vive kit and SteamVR's room setup to calibrate the environment. A small part of the kit, a tracker, is attached to the camera body and linked live to the CineCamera, so as we move the physical camera, the digital camera matches every movement in real time.

Once the tracking and technical requirements were solved, I had to create the assets. We scanned several of our physical props and sets with an Artec Spider scanner, and modelled many assets in Maya from designs and measurements taken in the studio. The captured scan data is processed in Houdini and Maya before being rigged and posed, and this virtual data can then be posed in previs.
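The upside-down camera handling described above amounts to a small pose correction: the tracker still reports the mount's orientation, so when the body is physically inverted the virtual camera's roll must be counter-rotated. Here is a minimal sketch of that idea in plain Python — not the actual Unreal Blueprint; the function name, the (roll, pitch, yaw) convention, and the unit assumptions are all hypothetical.

```python
def apply_tracker_pose(position, rotation_rpy, camera_flipped=False):
    """Return the pose to set on the virtual CineCamera.

    position       -- (x, y, z) from the Vive tracker, in studio units
    rotation_rpy   -- (roll, pitch, yaw) in degrees, as reported by the tracker
    camera_flipped -- True when the physical camera is mounted upside down
    """
    roll, pitch, yaw = rotation_rpy
    if camera_flipped:
        # The tracker follows the mount, so the captured image is rotated
        # 180 degrees; counter-rotate the virtual camera's roll to match.
        roll = (roll + 180.0) % 360.0
    return position, (roll, pitch, yaw)

# Example: tracker reports a level mount, but the camera body is inverted.
pose = apply_tracker_pose((10.0, 0.0, 1.5), (0.0, 5.0, 90.0), camera_flipped=True)
```

In the real system the same correction would be one extra rotation node in the Blueprint between the Live Link tracker input and the CineCamera transform.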
The directors and team work to achieve the desired look, including focal length, camera angles, virtual set dressing, and actor blocking. Once approved, the information is processed and transferred to Unreal, where the Virtual Production floor crew access it. The live image from the Canon camera is hosted in Dragonframe and ported into Unreal via a live feed output. In Unreal I created a composite of the live image and the CG backdrop, which we see by keying out our blue/green screen.

Virtual Production was used for set extension in several sequences on Chicken Run: Dawn of the Nugget. Often the environment in mind was too big to be built physically, such as the interior of Funland Farms, but with this technology, a constantly developing and evolving capability at Aardman, we weren't bound by such constraints. VP proved its worth on the floor through its flexibility, whether for live tracking on set visits, real-time compositing to demonstrate the vision of the final composition, or moving assets around the space like a virtual set dress.

After any floor changes are signed off following a unit visit, the data is exported back to VFX, who publish those changes back into Maya for camera and set updates. Next comes CGI animation and any final dressing changes. The finalised data is then published, ready for lighting and rendering alongside the stop-motion plates captured for the shot.
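The keying step in the composite above can be sketched as a crude nearest-colour key: pixels of the live frame close to the screen colour are replaced by the CG backdrop, everything else passes through. This is an illustrative toy in plain Python, not how the Unreal composite is actually built; real keyers work on chroma distance with softness and spill controls, and the key colour and threshold here are assumed values.

```python
def key_pixel(live_px, cg_px, key=(0, 180, 0), threshold=120.0):
    """Return cg_px where live_px is close to the key colour, else live_px.

    live_px, cg_px -- (r, g, b) tuples, 0-255
    key            -- assumed green-screen colour
    threshold      -- assumed Euclidean RGB distance cutoff
    """
    dist = sum((a - b) ** 2 for a, b in zip(live_px, key)) ** 0.5
    return cg_px if dist < threshold else live_px

def composite(live_frame, cg_frame, **kw):
    # Frames are lists of rows of (r, g, b) tuples, same dimensions.
    return [[key_pixel(lp, cp, **kw) for lp, cp in zip(lrow, crow)]
            for lrow, crow in zip(live_frame, cg_frame)]

# One green pixel (keyed out) next to one red pixel (kept).
live = [[(0, 180, 0), (200, 50, 50)]]
cg = [[(10, 10, 10), (10, 10, 10)]]
out = composite(live, cg)
```

The per-pixel decision is the whole trick: everything else in the pipeline is about getting a clean, calibrated live frame and a matching CG frame to feed into it.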