Metaplates
Advanced BG Plates for VP & VFX
With the rise of Virtual Production, notably LED volumes, we're seeing a stronger lean towards plate-shot material, both for cost and for photorealism.
However, plates have their limitations, the most significant being that they are 2D.
It won't be long until Gaussian Splats are production-ready, but in the meantime, we'll try some traditional methods to create a real-time environment.
Back in 2008, 2.5D and projection were a big thing. Using today's real-time render technology, we can utilise these old techniques to achieve more with 2D plates.
For this experiment, we'll use two cameras from the same nodal point and capture two datasets.
The first capture will use a Canon 5DS R + EF Lens to capture 819 images: 117 positions with seven bracketed exposures for each. This will be stitched into a 22-stop, 79k x 39k equirectangular image, also known as a Lat Long. This will create a giant spherical world view.
The second capture will use a Blackmagic 12K URSA with an EF Lens to capture four 10-second tiled clips, from top right to bottom left. This will be stitched together into one 10-second 20k x 11k cine plate. This will provide a layer of movement that will nest inside the sphere.
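Since the cine plate will nest inside the sphere, it helps to know the Lat Long's angular resolution and roughly how much of the sphere the plate will cover. A minimal sketch; the 40-degree horizontal field of view for the cine lens is a hypothetical value, not a measured one:

```python
def equirect_px_per_degree(width_px: int) -> float:
    """An equirectangular (Lat Long) image maps 360 degrees onto its width."""
    return width_px / 360.0

def plate_span_in_latlong(plate_hfov_deg: float, latlong_width_px: int) -> float:
    """How many Lat Long pixels the cine plate's horizontal FOV spans."""
    return plate_hfov_deg * equirect_px_per_degree(latlong_width_px)

latlong_ppd = equirect_px_per_degree(79_000)    # ~219 px per degree
# Hypothetical 40-degree horizontal FOV for the cine lens:
span = plate_span_in_latlong(40.0, 79_000)      # ~8,778 px of the sphere
```

This kind of back-of-envelope check shows why both captures need to be so large: even a modest field of view consumes thousands of pixels of the sphere.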

Stills camera on a robotic head

Cine camera on a robotic head
Below are the outputs from the two captures: a 79k x 39k HDRi Lat Long and a 20k rectilinear image.

HDRi still capturing the highlights

The same HDRi showing the low-lights

Cine plate with 15 stops of dynamic range
Colour Matching
It's important that the outputs from the two cameras match in colour. Quite a bit of time was spent creating a workflow to match them using ACES.
To simplify this process, both cameras were shot with the same lens and matched in ISO and f-stop. Each set of images was processed from OCFs into ACES 2065-1 EXRs and then stitched together.
The ColorChecker image to the right shows split patches for colour comparison.
On the left side is the Canon, and on the right is the Blackmagic. The results are close but not perfect.
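The split-patch comparison can be reduced to a simple per-channel difference in linear ACES values. A minimal sketch: the patch values below are illustrative placeholders, not readings from the actual chart, which in practice would be sampled from the stitched ACES 2065-1 EXRs:

```python
def patch_delta(canon_rgb, bm_rgb):
    """Per-channel linear difference between two matching patches."""
    return tuple(c - b for c, b in zip(canon_rgb, bm_rgb))

def max_abs_delta(canon_rgb, bm_rgb):
    """Worst-case channel mismatch for a patch pair."""
    return max(abs(d) for d in patch_delta(canon_rgb, bm_rgb))

# Illustrative neutral-grey patch values in linear ACES 2065-1 (placeholders):
canon_patch = (0.180, 0.181, 0.179)
bm_patch = (0.176, 0.183, 0.178)
assert max_abs_delta(canon_patch, bm_patch) < 0.01  # close but not perfect
```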

The colour wedges are split: Canon on the right, BM on the left.
Alignment
The first objective in making the scene 2.5D is to align the cine plate inside the spherical plate so that the vantage points match, creating a seamless field of view.
Once in position, the cine plate is graded to match the surrounding scene. To create the fake parallax, some matte work is needed to remove portions of the cine plate, allowing parts of the scene behind it to show through.
The matte work will be produced at a later date.
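The nesting above amounts to mapping each cine-plate pixel onto the equirectangular sphere. A simplified sketch, assuming both captures share the nodal point and the plate sits centred on the Lat Long's forward axis; the field-of-view value passed in is also an assumption:

```python
import math

def rectilinear_to_latlong(u, v, hfov_deg, plate_w, plate_h,
                           latlong_w, latlong_h):
    """Map a cine-plate pixel (u, v) to equirectangular pixel coordinates,
    assuming a shared nodal point and a plate centred on the forward axis."""
    # Focal length in plate pixels, derived from the horizontal field of view.
    f = (plate_w / 2.0) / math.tan(math.radians(hfov_deg) / 2.0)
    # Ray direction for this pixel (camera looking down the forward axis).
    x = u - plate_w / 2.0
    y = v - plate_h / 2.0
    lon = math.atan2(x, f)                     # longitude, radians
    lat = math.atan2(-y, math.hypot(x, f))     # latitude, radians
    # Longitude/latitude to equirectangular pixel coordinates.
    px = (lon / math.pi + 1.0) * latlong_w / 2.0
    py = (0.5 - lat / math.pi) * latlong_h
    return px, py
```

By construction, the plate's centre pixel lands at the centre of the Lat Long, and its edges land half the field of view to either side, which is what makes the seamless nesting possible.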


Above: the quad view with the image plane nested inside the sphere.
Left: some moving context showing how the frame is nested.
Results
A milestone for this experiment was to see how well two large assets from different capture sources and formats could fit together, following the steps above.
Below you can see what the virtual camera can see inside the comp at two focal lengths. The red dashed box shows the edges of the nested cine plate.
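The virtual camera's coverage at each focal length follows the standard pinhole relation; the sensor width used here is a hypothetical Super 35-style value, not the actual virtual camera setting:

```python
import math

def horizontal_fov_deg(focal_mm: float, sensor_width_mm: float = 24.9) -> float:
    """Horizontal field of view of a pinhole camera from its focal length."""
    return math.degrees(2.0 * math.atan(sensor_width_mm / (2.0 * focal_mm)))

fov_wide = horizontal_fov_deg(24.0)  # wider view: sees sphere around the plate
fov_long = horizontal_fov_deg(85.0)  # longer view: stays inside the cine plate
```

The wider the virtual lens, the more of the surrounding Lat Long appears around the red dashed box; longer focal lengths keep the frame inside the nested plate.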
The results are promising and offer a good alternative to a complex, expensive game-engine environment and the hardware needed to drive it.
The scene is ready for the introduction of Gaussian Splat technology, which could provide a real-time 3D asset at the centre of the scene, along with additional CG objects lit by the same surrounding HDRi image.
For our next experiment, we'll use a different location that can demonstrate how all the different visual technologies can truly work together.

