- This is the LED set used for The Mandalorian.
- The interior of the set, as performers might see it.
- Here’s the set with a large prop in it. Sometimes only the LED set was used, and sometimes it was used in tandem with practical elements.
- The camera captures both physical objects and the virtual background directly.
- This visualization shows virtual set elements alongside real actors and how they appear in a shot.
- Scene assets can be removed or moved easily.
- As seen in this screen capture from a promotional video by Epic Games, directors and DPs can do location scouting in VR.
- Also from that video: you can see that the frustum for the camera is displayed on the panel in real-time in sync with camera orientation.
- For some shots, the scenes and assets used in shooting may not be final. In those cases, a green screen can be displayed for use in post-production.
- Lighting can be changed in real-time using an iPad. In many cases, the lights from the LEDs as configured here act as the final scene lighting.
- Presets can also be set and near-instantly switched between for efficient shoots.
- A stage director handles many of these changes on-set.
Industrial Light and Magic has published a behind-the-scenes video on the production of Disney+’s The Mandalorian that gives an illuminating look at two of the biggest high-tech trends in film and TV production: LED sets and the use of game engines to build scenes. The video explains a major shift in virtual filmmaking that most viewers never see.

It has historically been impractical to achieve the production values seen in The Mandalorian in a TV series, because the kind of visual effects work necessary simply takes more time than a TV production schedule allows. Generally, special effects-driven productions shoot scenes with actors and props in front of a green screen, and teams then add in the background environments and any computer-generated objects during a lengthy post-production period.
That’s not how things worked on The Mandalorian. Executive producer Jon Favreau, Industrial Light and Magic, and game engine-maker Epic Games collaborated to use the Unreal Engine to pre-render scenes and then display them as parallax-corrected images on giant LED walls and an LED ceiling in a 21-by-75-foot digital set. It’s part of a lineage of production techniques and tools developed by Favreau’s teams called StageCraft. This approach offered numerous benefits.
The ILM video demonstrating the tech used to shoot The Mandalorian.
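The core of that parallax trick can be illustrated with simple geometry. The following is a toy sketch, not Epic's implementation, and the coordinate setup is invented for illustration: to make a virtual point look correct from the camera's point of view, the wall must draw it where the line from the tracked camera position through that point intersects the wall plane. As the camera moves, the drawn position shifts, which is what sells the depth.

```python
# Toy sketch of camera-driven parallax on a flat LED wall; not Epic's code.
# Assumed setup: the wall lies in the plane z = 0, the camera is in front of
# the wall (z > 0), and the virtual scene is "behind" it (z < 0).

def project_to_wall(camera, point):
    """Return the (x, y) wall position where a virtual `point` should be
    drawn so it lines up with the tracked `camera` (both (x, y, z) tuples)."""
    cx, cy, cz = camera
    px, py, pz = point
    t = cz / (cz - pz)  # parameter where the camera->point ray crosses z = 0
    return (cx + t * (px - cx), cy + t * (py - cy))

# The same virtual point, seen from two camera positions: the wall redraws
# it at a different spot, producing parallax.
print(project_to_wall((0.0, 1.5, 4.0), (0.0, 1.5, -8.0)))  # camera centered
print(project_to_wall((2.0, 1.5, 4.0), (0.0, 1.5, -8.0)))  # camera moved right
```

A real system re-renders the full scene with an off-axis projection every frame rather than projecting points one at a time, but the geometry is the same.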
First off, actors could see virtual objects and environments around them in real time, including horizon lines. This solves a long-standing problem with VFX-heavy productions: actors struggling to get into a scene or respond realistically to objects or sights within it. Assets could also be changed on the fly at the request of the director or director of photography (DP). So if the director decides that a certain building in the background is messing with the framing or otherwise detracting from their vision, they can ask the stage operator to move the building in just a few seconds.
On that same point, real-time lighting is provided by the LED panels, so lighting can be changed from a simple iPad interface without long stretches of manually repositioning physical lights. (That task eats an enormous amount of time out of each day on traditional shoots.) Entire sets can be unloaded and replaced with totally new ones in a matter of minutes, provided practical effects aren't heavily used in tandem, something completely impossible in traditional filmmaking.

All of this allows the director, crew, and creatives to be more flexible in production, try different approaches, and ultimately avoid hundreds or even thousands of hours of revisions in post-production. It even extends to pre-production: the toolkit allows scoping out shots and location scouting in virtual reality before shooting begins. And it lets producers avoid the enormous cost of shipping entire productions, crews, and casts to far-flung deserts, forests, tundras, or what have you for location shooting.
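The near-instant lighting changes described above come down to state switching rather than physical rigging. Here is a minimal sketch of that idea; the preset names and parameters are invented for illustration and have no connection to ILM's actual tooling:

```python
# Hypothetical sketch of preset-switched LED-wall lighting.
# Nothing here reflects ILM's or Epic's real software.
from dataclasses import dataclass

@dataclass(frozen=True)
class LightingPreset:
    name: str
    color_temp_k: int       # color temperature in Kelvin
    intensity: float        # relative brightness, 0.0 to 1.0
    sun_azimuth_deg: float  # direction of the virtual key light

class WallLighting:
    """Stores named presets; switching is a dictionary lookup, which is why
    it can happen between takes with no physical rigging time."""
    def __init__(self):
        self._presets = {}
        self.active = None

    def save(self, preset):
        self._presets[preset.name] = preset

    def switch(self, name):
        self.active = self._presets[name]
        return self.active

rig = WallLighting()
rig.save(LightingPreset("desert_noon", 5600, 1.0, 180.0))
rig.save(LightingPreset("desert_dusk", 3200, 0.4, 265.0))
print(rig.switch("desert_dusk").color_temp_k)  # prints 3200
```

The point of the sketch is the asymmetry: saving a look once is slow, careful work, but recalling it is effectively free, which is what makes shooting the same set at "noon" and "dusk" back to back practical.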
Favreau features prominently in the video, but he went into much more detail during an interview at 2019’s annual SIGGRAPH computer graphics conference. Quoted in VFX blog Befores & Afters, he said:
We got a tremendous percentage of shots that actually worked in-camera, just with the real-time renders in engine, that I didn't think Epic was going to be capable of. For certain types of shots, depending on the focal length and shooting with anamorphic lensing, there's a lot of times where it wasn't just for interactive, we could see in camera, the lighting, the interactive light, the layout, the background, the horizon. We didn't have to mash things together later. Even if we had to up-res or replace them, we had the basis point and all the interactive light.
VFX were not the only benefit, he added:
For the actors, it was great because you could walk on the set, and even if it's just for interactive light, you are walking into an environment where you see what's around you. Even though [the LED walls] might not hold up to the scrutiny if you're staring right at it from close up, you're still getting peripheral vision. You know where the horizon is, you feel the light on you. You're also not setting up a lot of lights. You're getting a lot of your interactive light off of those LED walls. To me, this is a huge breakthrough.
Engines traditionally used for game development have become bigger and bigger players in film and TV production in recent years, as part of the ongoing development of the virtual filmmaking disciplines.
A video from Epic Games that gives a more detailed look at the tools and how they can be used.
Unreal is not the only engine used for this kind of work, either. Favreau previously used competing game engine Unity for the Disney CG/live action remakes of The Jungle Book and The Lion King, and Unity has been used in other productions like Blade Runner 2049 as well; it is often used for pre-visualization. And while it's animation software rather than a game engine, we recently published an article on how the real-time animation tool iClone was used in the production of the Keanu Reeves film Replicas.
The Mandalorian is not the first big production to use LED sets, but some of the specifics of the implementation here, particularly features driven by Unreal, are cutting edge. This video has led to some effusive tweets and headlines from the VFX and virtual filmmaking professional communities, including the headline “You are going to flip when you see this video of how The Mandalorian was made” from the above-mentioned Befores & Afters.
These are the early days for some (though not all) of the related technologies, and further developments should bring more refinements. The benefit here isn't that these techniques produce more realistic results than traditional VFX workflows; in fact, some viewers felt The Mandalorian looked a little flat in places, and some of that impression stems from limitations of this tech. But as The Mandalorian has already shown, technology like this will make previously impractical concepts for TV series and films far more feasible.

Listing image by ILM