Even before behind-the-scenes photos of Disney’s The Mandalorian got filmmakers all over the world geeking out over the technology’s potential, the concepts behind virtual production had found their way into the cinematographic process. From pre-visualisation of key action scenes during pre-production to monitoring low-poly renderings of motion capture performances on-set, the scope of virtual production in filmmaking has only begun to open up. One area of pre-visualisation that seems at once the most natural use of the technology and the most confounding in the scope of its application is the use of LED screens to display virtual set extensions in-camera, allowing actors to deliver performances in a living, breathing environment and to break beyond the realm of 1s and 0s that would traditionally contain them.
Usually this technology is locked away in the dungeons of mega-visual effects houses, but this past month, Cutting Edge in Brisbane, in collaboration with Big Picture Technologies and ARRI Australia, allowed us into Edge/Lab, their research and development space, and hosted a live demonstration of this technology in action. Attendees, who included members of the Australian Production Designers Guild (APDG), Visual Effects Society (VES) and our own ACS, were able to get hands-on experience with Edge/Lab’s virtual production setup, complete with truly giant LED screens, an Unreal Engine system and the ARRI Alexa Mini LF with Signature Primes, all running in sync. The discussion was spearheaded by award-winning cinematographer Jason Hargreaves ACS, along with head of virtual production at Edge/Lab, Tim Schultz, and national sales manager at ARRI Australia, Sean Dooley.
Hargreaves, who had recently filmed the Screen Queensland-assisted short film Decommissioned on the Griffith Film School sound stage, utilised Unreal Engine to create parallax-correct rear-projection elements for the film instead of relying on greenscreens. Now that the space is equipped with the HDR, low-pixel-pitch LED screens provided by Big Picture, he believes that, for certain applications, this will be a key step towards breaking down a vast number of creative barriers.
“I think the exciting thing about this technology, from my perspective as a cinematographer, is the fact that you can kind of use the interactive lighting that is coming off the screens,” says Hargreaves. “I think that is an amazing aspect because you get the real reflections in people’s eyes, and in objects, and glass and things like that. But it also allows you to shoot in places that you potentially could never afford to get to.”
Additionally, the quick and seamless transition from a basketball court to a moving subway train demonstrates that the cost associated with allocating resources to locations at certain times of day is no longer a burden on creativity, since any natural environmental factors are entirely within your control in the studio. “This technology has the ability to streamline some production in terms of scheduling. You could potentially be shooting a desert scene in the morning, and then you could be shooting an Antarctic scene or a city scene in the afternoon. So in that aspect, it saves travel time, location costs, location fees, getting trucks there, all that sort of stuff,” he continues.
Streamlining the scheduling of productions won’t be the only time-saving measure with this kind of capture method. “Having all key collaborators on-set able to see and approve of virtual production elements will also save a great deal of the time that would usually be spent in post-production,” says Schultz. “I think what this type of shooting offers is the interactive lighting and being able to see everything in camera. I think that what we are trying to achieve here is to be able to do in-camera visual effects and to get what you might call ‘Final Pixel’, or to get everyone on-set to make decisions and changes and be happy with the way it looks in-camera.”
One element that Schultz strongly indicated about this ‘Final Pixel’ concept was the inherent shift from what would typically be a post-production-heavy workflow towards one with significant weight in pre-production. “Decisions that might have happened in post might have to move into pre-production, but I don’t think that’s a bad thing,” says Schultz. “What we’re going to start to do is we’re going to have cinematographers, visual effects crew, production designers and directors all working together in pre-production to work out what they need to happen on-set.”
Both Hargreaves and Schultz indicate that cinematographers and production designers would need to work more closely with each other and their respective departments in pre-production. Having collaborative discussions as early as possible is key to ensuring the virtual environments meet the technical standard needed to support their director’s vision.
“Integrating production design, real sets and real props into the virtual environment is essential for marrying up the reality and the believability of the scene, I think. So the relationship between the production design department, production designer and the cinematographer is going to become even closer with this technology,” says Hargreaves.
“If you’re going to need ten different locations to go up onto these screens, then you’re going to have to make those decisions about what those locations are, what time of day they are, who goes out and captures photogrammetry data, LiDAR scans or HDRIs, and all of the technical things you then need to recreate that on the screens,” adds Schultz. “Obviously, there is going to be work that needs to be done in that early stage, but I think by doing that you’re all going to work together virtually scouting and using virtual production technology, which is not just these LED walls. It’s real-time animation and rendering, so you can play out the shots before you film them.”
Asked how to turn this strategy into an actionable pipeline, Schultz replies, “We were just talking to the APDG, some of the members there, about creating all their sets virtually before they design them physically, so there are workflows where we can start to take those assets from them: while they build the physical one, we can start building virtual ones, then we can meld those on set so that they are extending sets.”
This technology is not the be-all and end-all solution just yet. “One of the limitations that we are currently looking at with LED screens is obviously moiré,” says Schultz. “Focusing close to the screen, the resolution that you’re getting out of screens looks great when there’s a soft focus to it, but when you actually pull focus to the screens you start to see the actual diodes themselves, which creates moiré patterns. You’ve also got the size of screens, and how big you need to build them if you want to get quite wide shots with sweeping camera moves. So you tend not to be focusing on the actual screens themselves; mid-shots and close-ups work really well, while if you want to do big, establishing wide shots, the screens don’t really have the resolution, or the size, to be able to do those.”
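The moiré Schultz describes can be roughly quantified: interference appears when the image of the LED grid on the sensor approaches the sensor’s own photosite pitch. A minimal back-of-envelope sketch in Python (the pixel pitch, focal length and distances here are illustrative assumptions, not Edge/Lab’s figures) uses thin-lens magnification to estimate how large one LED pixel lands on the sensor:

```python
def led_pixel_on_sensor(pitch_mm: float, focal_mm: float, distance_mm: float) -> float:
    """Size (mm) of one LED pixel imaged onto the sensor.

    Thin-lens magnification m = f / (d - f), so a pixel of
    `pitch_mm` at object distance `distance_mm` images to pitch * m.
    """
    return pitch_mm * focal_mm / (distance_mm - focal_mm)

# Illustrative numbers: 2.6 mm pitch wall, 47 mm lens, camera 4 m from the wall
image_of_pixel = led_pixel_on_sensor(2.6, 47, 4000)  # ~0.031 mm on the sensor
photosite = 0.00825  # ~8.25 µm photosite, typical of a 4.5K large-format sensor

# Crude heuristic: once the imaged LED grid is within an order of magnitude
# of the sensor's sampling pitch, the two grids beat against each other and
# moiré appears; defocusing the wall smears each LED over many photosites.
moire_risk = image_of_pixel < 10 * photosite
```

The useful takeaway is the trend, not the exact threshold: moving the camera back or choosing a finer-pitch wall shrinks `image_of_pixel`, while throwing the wall out of focus removes the sharp grid altogether.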
Due to border closures between Queensland and New South Wales in response to the Covid-19 pandemic, Big Picture’s LED screens could not be delivered in time to shoot Decommissioned. Hargreaves opted to shoot using Unreal Engine, but projected through the gaffer’s 20×12 silk instead. This flexibility allows the technology to be scaled up or down based on the production’s budget and scope.
But the screens are only one part of the equation. The studio also needs a camera to capture all the content and sell the believability of the environment. For this demonstration, Cutting Edge partnered with ARRI Australia to showcase how well the large-format sensor, paired with Signature Primes, can fool even a room of onlookers into believing what they see on the live feed straight from the camera is the real deal.
“I think we have a natural advantage in that the large format cameras that we have, have a lower resolution than other cameras on the market,” says Dooley when asked how the Alexa Mini LF’s lower pixel density actually works to this technology’s advantage. “So unlike having a 6K or an 8K large format camera, we have a 4.5K camera, which clearly meets all the delivery guidelines for Netflix, but we have larger pixels which means that you can be closer to the screen without seeing moiré. I think the large format cameras are naturally suited to this type of production. They’re already the ones being chosen for all the big shows like Thor and The Mandalorian who are using this technology, and it’s a lovely position to be in where the tool that cinematographers are most comfortable with and is already chosen suits this technology so well, and so we don’t have to have a massive change in the industry.”
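Dooley’s point about pixel size follows directly from sensor geometry: for a fixed sensor width, photosite pitch is simply width divided by horizontal pixel count. A quick sketch, using the publicly listed Alexa Mini LF sensor width of roughly 36.7 mm and 4448-pixel width (the 8K figure is a hypothetical comparison camera on the same sensor width, not any specific model):

```python
def photosite_pitch_um(sensor_width_mm: float, horizontal_pixels: int) -> float:
    # Photosite pitch in micrometres: sensor width divided by pixel count
    return sensor_width_mm / horizontal_pixels * 1000

alexa_45k = photosite_pitch_um(36.7, 4448)        # ~8.3 µm
hypothetical_8k = photosite_pitch_um(36.7, 8192)  # ~4.5 µm on the same width
```

The 4.5K sensor’s photosites come out nearly twice as large as the hypothetical 8K camera’s, which is the coarser sampling grid Dooley credits with tolerating closer camera-to-wall distances before visible interference sets in.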
Lens engineering has increasingly become more important with the advancement of LED technology, and focus falloff is one of the key elements one must consider when choosing the right lenses for shooting against an LED volume. “One of the things we brought tonight are the Signature Primes, and we’re about to release a bunch of tests that look into how focus fall-off is different between different brands of lenses,” Dooley explains. “You might think that if you have a 50mm lens at a certain focus distance, say six feet, that the depth-of-field would be the same, or rather the focus fall-off into the out-of-focus areas would be the same. But it’s really not, and one of the ways that the Signature Primes were designed was to accentuate a really steep fall-off of that sharpness. You’ll have an area that’s sharp, and then it very quickly becomes out of focus, and that’s really great for LED volumes where you need to throw the background out of focus in order to minimise moiré problems.”
Despite the uncertainty of how far this technology can go, and what limitations can and cannot be easily overcome, the one element that shone through from all three perspectives was how this technology will benefit creative expression.
“We’ve been a part of quite a few demos now and I think the thing that people take away the most from it is the realism within the space,” says Dooley. “Especially in terms of an actor’s performance, because they can really kind of see what environment they are reacting with.”
“What it allows for is a collaboration that probably has not happened before because you need to get it all done in camera and that’s going to be the interesting thing: how we all work together, to collaborate, to make it look good,” adds Schultz.
All in all, this technology will only continue to grow and undergo constant refinement as it makes its way to becoming an integral part of the filmmaking process. As for when we can expect to see our own dedicated LED Volume in Queensland: “I think it’s only a matter of time before someone sets up a stage here in Queensland,” says Hargreaves. “It really is at the pointy end of technology and innovation. It is going to be a part of every filmmaking process, I believe, in the future. A film will shoot one or two scenes on the screens at some point, maybe not the whole film, but it’ll become integrated into all films at some point, I reckon.”
Kevin Nguyen is a cinematographer and editor of the ACS Queensland newsletter.
Alex Shingles is a cinematographer and secretary of ACS Queensland.