How THE MANDALORIAN And ILM Created A Visual Effects Breakthrough
Star Wars is the house that visual effects built. And in the new Disney+ series The Mandalorian, that house is projected in real time onto a 270-degree LED video wall using game-engine technology. Because of it, visual effects are likely changing forever.

The method The Mandalorian used involves shooting actors on a set dubbed “the Volume”. (On film sets, a “volume” refers to any space tracked by motion-capture technology.) The actors and crew are surrounded by a video wall that’s 21 feet tall and 75 feet in diameter, run by 11 computers, and comprising 1,326 LED panels. This wall doesn’t just provide interactive lighting — it also uses game-engine technology to show renders of environments and effects in-camera, in real time. The Mandalorian is the first production to combine all of this revolutionary tech, pairing real-time rendering with in-camera video-wall set extensions and effects and a virtual production workflow.

The Volume, formally called StageCraft, was created by Industrial Light & Magic and Epic Games, partnering with Fuse, Lux Machina, Profile Studios, NVIDIA, and ARRI, in collaboration with Jon Favreau’s Golem Creations. With it, Favreau and his team on The Mandalorian could literally move mountains, build Sandcrawlers on sound stages, and take actors from Iceland to Tunisia in under an hour.

What Is The Volume?

What do the Dawn of Man and King Kong have in common? Front-screen projection.

Front-screen projection combines filmed performances in the foreground with pre-filmed material projected in-camera in the background. The method’s most famous usage was in the “Dawn of Man” sequence in Stanley Kubrick’s 2001: A Space Odyssey, where the hominids were shot on a set with an African backdrop projected onto a screen behind them. This projection trick was used as early as 1933’s King Kong.


For The Mandalorian, front-screen projection was achieved by surrounding a stage with an LED video wall and ceiling. Virtual environments were displayed on the LEDs to serve not only as backgrounds — as for the sunsets on Tatooine or the hyperspace in cockpit scenes — but also as set extensions. The Mos Eisley hangar, from episode five, exists as practical props and floor on set, but it also extends an additional 30 feet in each direction beyond what the set allows. Vehicles like the Jawas’ Sandcrawler or the Razor Crest, the ship that the Mandalorian (Pedro Pascal) pilots, were created using a combination of practical sets and virtual extensions: half of the ship would be in the Volume, the other half on the wall.

Normally, on effects-heavy films and series, green- and blue-screen sound stages are deployed to photograph actors, and the backgrounds and action are composited in later. But such tools are expensive, grind post-production to a crawl, and burden the actors, cinematographers, and directors, who have no idea what they’re reacting to in real time. The Volume is the solution to these workflow issues.

Bringing Iceland To Los Angeles

Over 50 percent of the show’s first season was filmed in the Volume, as a way of cutting costs and adapting to a television series timetable. Not that it’s especially cheap — the first season’s budget was $100 million; for comparison, the final season of Game of Thrones had a $90 million budget. But The Mandalorian is far less expensive than a Star Wars film and about four times as long. The relatively cheaper Star Wars anthology movies Rogue One and Solo each had budgets over $200 million.

The budget couldn’t cover transporting cast and crew across the world multiple times for location shoots, so the Volume allowed camera crews to bring the world to their studio instead. Crews traveled to locations like Chile and Iceland, and they brought thousands of photographs and 3D photographic scans back to Manhattan Beach Studios in Los Angeles, where most of the show was shot. The visual effects team also used renderings ILM had stored from previous Star Wars projects, including scans and photos of Greenland and Tunisia. This photogrammetry formed the basis of the photoreal worlds created in StageCraft.

For director of photography Baz Idoine and co-producer and co-cinematographer Greig Fraser, the latter of whom also shot Rogue One, the Volume was far more generous than a normal blue-screen sound stage. For one thing, StageCraft completely lit their sets.


“Often, you’re shooting real photography elements before the backgrounds are created and you’re imagining what the interactive lighting will do — and then you have to hope that what you’ve done on set will match what happens in post much later on,” Fraser told American Cinematographer. “If the director changes the backgrounds in post, then the lighting isn’t going to match and the final shot will feel false.”

On the Volume, all of the lighting was generated by the LED walls. This meant that the virtual environment was designed with lighting in mind. Assets like the sun in the Tatooine sunset shots had to be much brighter than the surrounding sky so the light projected onto the performers and the set was seamless.

For the outdoor daytime scenes when the sun was more prominent — as in the Sandcrawler sequence in “The Child,” most of the Sorgan-based “Sanctuary,” and the battles in the concluding two episodes — the crew had to move to the backlot. LEDs can cover many bases, but they cannot match the intensity of daytime sunlight.

When shooting in the Volume, Idoine liked putting a narrow band of white near the top of the LED wall to act as a strong backlight for Mando’s helmet. Any LED panel could be set to full brightness or act as a black flag — that is, to block light — whenever and wherever the cinematographers required.

The DPs could also control the location conditions. If episode directors Dave Filoni, Rick Famuyiwa, Deborah Chow, Bryce Dallas Howard, and Taika Waititi needed mountains moved to improve a composition, the visual effects team could go into the virtual world and move them. And they could keep a virtual sunset going for 10 hours: “magic hour” could last a whole day of shooting instead of just 25 minutes, greatly speeding up the shoot.

Reflections And Moiré In The Volume

Using the LED walls for lighting also solved one of the biggest headaches for green- and blue-screen sets: reflections.

In Revenge of the Sith behind-the-scenes footage, George Lucas makes a big deal out of C-3PO — it’s the first film of the prequel trilogy in which the droid is shiny. And because of that, you can see reflections of the green screen, the camera, and the crew in Anthony Daniels’ costume, which had to be laboriously mapped out and replaced with new reflections in post-production. The Mandalorian faced the same problem, though this time, the shiny guy wasn’t just a peripheral player.


In the first two episodes, when Mando’s look is meant to be scummier and more worn — and particularly in the messy mudhorn fight in episode two — reflections aren’t a big concern. It’s episode three onward, once he gets his new Beskar armor, that Mando is transformed into a walking, gunslinging mirror. The Volume allowed for correct reflections that wouldn’t need to be modified in an editing bay, and it also permitted more nuanced highlights on the character’s armor than a green-screen set would have provided. But such luxuries require enormous time spent designing the world.

“Even if we were only shooting in one direction on a particular location, the virtual art-department would have to build a 360-degree set so we could get the interactive lighting and reflections right,” Fraser told American Cinematographer. “This was also true for practical sets that were built onstage and on the backlot — we had to build the areas that we would never see on camera because they would be reflected in the suit.”

To complete the reflective look, additional walls could be added to the Volume for 360-degree coverage, and walls could be removed to improve access.

The LED screens, though, pose a different problem: the moiré effect, the wavy interference pattern that sometimes appears when screens are captured on camera. The solution combined the LEDs’ fine 2.84mm pixel pitch — the spacing between adjacent LEDs — with the camera lens’s shallow depth of field, which together neutralized the moiré. With these in-camera effects, post-production on scenes shot in the Volume could focus on details instead of big-picture compositing.
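For a sense of the scale that pitch implies, here’s a back-of-envelope estimate of the wall’s resolution — a rough sketch using only the dimensions cited in this article, not official ILM figures:

```python
import math

# Rough estimate of the Volume's LED wall resolution, using the
# figures cited above: 75-foot diameter, 21-foot height, 270-degree
# arc, 2.84mm pixel pitch. All conversions are approximate.
FT_TO_M = 0.3048

diameter_m = 75 * FT_TO_M      # ~22.9 m
height_m = 21 * FT_TO_M        # ~6.4 m
pixel_pitch_m = 2.84e-3        # 2.84 mm between adjacent LEDs

# The wall wraps 270 degrees (three-quarters) of the full circle.
arc_length_m = math.pi * diameter_m * (270 / 360)

horizontal_px = arc_length_m / pixel_pitch_m
vertical_px = height_m / pixel_pitch_m

print(f"arc length: {arc_length_m:.1f} m")
print(f"approx. wall resolution: {horizontal_px:,.0f} x {vertical_px:,.0f}")
```

That works out to a display on the order of 19,000 by 2,250 pixels, which gives a sense of why it takes 11 computers to run the wall.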

Shooting In A Virtual World

As groundbreaking as the Volume’s lighting effects and virtual world are, StageCraft also features a 3D parallax view.

“Parallax” refers to the apparent shift of objects at different distances when the viewpoint moves. In the Volume, that translates to a virtual environment that updates in real time based on where the camera is positioned and pointed — an innovation courtesy of Epic Games’ Unreal Engine.
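ILM’s StageCraft renderer is proprietary, but the core principle behind camera-tracked parallax is well documented: re-project the virtual scene every frame so that, from the tracked camera’s position, the imagery on the screen lines up in correct perspective. Below is a minimal sketch of the standard “generalized perspective projection” for a flat screen patch, written in Python with NumPy for clarity; the corner positions and eye point are illustrative, not StageCraft’s actual interface:

```python
import numpy as np

def off_axis_projection(eye, pa, pb, pc, near=0.1, far=1000.0):
    """Build an off-axis projection matrix for a tracked viewpoint.

    eye -- tracked camera position; pa, pb, pc -- the screen patch's
    lower-left, lower-right, and upper-left corners (world space).
    As `eye` moves, the matrix changes, producing parallax on screen.
    """
    # Orthonormal basis of the screen plane.
    vr = (pb - pa) / np.linalg.norm(pb - pa)   # screen right
    vu = (pc - pa) / np.linalg.norm(pc - pa)   # screen up
    vn = np.cross(vr, vu)                      # normal, toward the eye

    # Vectors from the eye to the screen corners.
    va, vb, vc = pa - eye, pb - eye, pc - eye
    d = -(va @ vn)                             # eye-to-screen distance

    # Frustum extents projected onto the near plane.
    l = (vr @ va) * near / d
    r = (vr @ vb) * near / d
    b = (vu @ va) * near / d
    t = (vu @ vc) * near / d

    # Standard OpenGL-style frustum matrix.
    P = np.array([
        [2*near/(r-l), 0.0,          (r+l)/(r-l),            0.0],
        [0.0,          2*near/(t-b), (t+b)/(t-b),            0.0],
        [0.0,          0.0,          -(far+near)/(far-near), -2*far*near/(far-near)],
        [0.0,          0.0,          -1.0,                   0.0],
    ])

    # Rotate the world into screen space, then move the eye to the origin.
    M = np.eye(4)
    M[0, :3], M[1, :3], M[2, :3] = vr, vu, vn
    T = np.eye(4)
    T[:3, 3] = -eye
    return P @ M @ T
```

Recomputing this matrix every frame from the camera tracker’s data is what keeps the background shifting correctly behind the actors, the one thing a static painted backdrop can never do.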


In Disney Gallery: The Mandalorian, a behind-the-scenes series on Disney+, visual effects supervisor Richard Bluff explains what’s so special about using a game engine: “The [visuals] that anybody playing a video game [is] looking at are being calculated in milliseconds. So if you move right or you move left in a scene or you turn around and you see a view in an environment that you’ve never seen before, it’s happening in milliseconds. It’s real-time.”

To get it right, the production’s camera specs — a 2.39:1 anamorphic ARRI Alexa LF — were programmed into StageCraft, which ensured that the portion of the wall the camera was seeing was rendered completely, while the rest of the wall could display a low-resolution image for lighting purposes.
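In principle, that split is a per-panel level-of-detail decision. The sketch below illustrates the idea only; the function name, threshold, and margin are invented for the example and aren’t drawn from ILM’s pipeline:

```python
import numpy as np

def panel_render_quality(cam_pos, cam_forward, panel_center,
                         half_fov_deg=30.0, margin_deg=5.0):
    """Pick a render quality for one LED panel: full quality if the
    panel sits inside the tracked camera's view cone (plus a safety
    margin), and a cheaper lighting-only render otherwise.
    """
    to_panel = panel_center - cam_pos
    to_panel = to_panel / np.linalg.norm(to_panel)
    cos_angle = np.clip(to_panel @ cam_forward, -1.0, 1.0)
    angle_deg = np.degrees(np.arccos(cos_angle))
    # The margin keeps panels near the frustum edge at full quality,
    # so a low-res seam never drifts into frame as the camera pans.
    return "full" if angle_deg <= half_fov_deg + margin_deg else "lighting-only"
```

Every panel outside the cone still matters: it’s what lights the actors and reflects in Mando’s armor, so it can’t simply go dark.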

“Sputniks” were fixed to the camera to track its location in real time. These sputniks were basically small white ping-pong balls that the motion-tracking cameras around the Volume saw and used to record the camera’s position in virtual space.

“It had positional data from the camera so you could have perspective and parallax change as the camera moved,” Idoine said, “and so the content on the wall would change as the camera was changing.”
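Recovering a position from markers like this comes down to classic multi-camera triangulation: each tracking camera sees a sputnik along a ray, and the rays’ near-intersection locates it in 3D. A toy two-camera version is sketched below; real optical motion-capture systems, like the Profile Studios setup used here, solve this across many calibrated cameras with filtering on top:

```python
import numpy as np

def triangulate(o1, d1, o2, d2):
    """Estimate a marker's 3D position from two tracking cameras.

    Each camera reports a ray (origin o, unit direction d) toward the
    marker. The estimate is the midpoint of the shortest segment
    between the two rays (assumes the rays aren't parallel).
    """
    w = o1 - o2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w, d2 @ w
    denom = a * c - b * b          # ~0 would mean near-parallel rays
    t1 = (b * e - c * d) / denom   # parameter along ray 1
    t2 = (a * e - b * d) / denom   # parameter along ray 2
    p1 = o1 + t1 * d1              # closest point on ray 1
    p2 = o2 + t2 * d2              # closest point on ray 2
    return (p1 + p2) / 2
```

With several sputniks rigidly attached to the camera, the same math yields not just its position but also its orientation, which is exactly what the parallax re-projection needs.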


The Volume upended the traditional production workflow, turning The Mandalorian’s pipeline into something closer to that of an animated film. The cinematographer, virtual art department, location scouts, lighting team, and VFX teams had to be involved with production even before principal photography began. They designed 3D virtual sets in StageCraft, which the cinematographers and art departments would then virtually scout out using VR headsets, perfecting the production design and choosing digital actors, props, textures, and skydomes.

An array of artists from ILM, Unreal, and Profile manned “Brain Bar” workstations in the studio during production. These stations were responsible for quick color correction jobs and tweaking the virtual lighting between shots.

How The Volume Was Created

Lucas had been trying to develop technology akin to StageCraft over a decade earlier, in 2008, when he was investigating his own live-action Star Wars series, which never came to fruition.

ILM has been on the cutting edge of VFX for decades, and its earlier innovations contributing to StageCraft include 1991’s Hook, for which a matte painting was first mapped to 3D geometry, allowing for camera parallax. But while ILM had the wherewithal, it took Favreau and a string of VFX-heavy, envelope-pushing Disney films to make the Volume a reality.

This is no first rodeo, after all, for Favreau, the creator, producer, head writer, and showrunner of The Mandalorian. In addition to his effects-heavy portfolio, which includes Elf, Zathura, and Iron Man, Favreau pushed the boundaries of virtual film production with his 2019 Lion King remake. Where his 2016 Jungle Book reimagining photographed Neel Sethi (as Mowgli) and his motion-capture costars on a blue-screen soundstage with interactive lighting, The Lion King pioneered the game-engine technology that Favreau and ILM would go on to use in The Mandalorian.


In The Lion King, Favreau and his team adjusted lighting, operated cameras, and shot their footage in virtual reality. They had an earlier iteration of the Volume, too, except their world was completely virtual, with no tangible set at all. Wired described the Lion King set as “a large open space in which the crew has set up dolly tracks or cranes. […] Viewfinders are festooned with pucks, hand-sized globs of plastic that broadcast infrared signals. Overhead on a metal truss, a matrix of 3D sensors tracks the signals and translates the viewfinders’ positions back into VR.”

On this set, behind the scenes, Favreau was experimenting with in-camera virtual parallax. His tinkering with basic 3D geometry and shooting real subjects in a virtual world would directly inform his work with StageCraft.

Carl Weathers Loves The Volume

This technology benefits not only the filmmakers but also the actors. In the Star Wars prequels’ behind-the-scenes reels, Ewan McGregor and Hayden Christensen are faced with the insurmountable task of being plopped onto an entirely green-screen set and reacting to nonexistent locations, characters, and events. Revenge of the Sith, which had 2,400 VFX shots, more than any other film in the franchise, has scenes where fully costumed Wookiees charge over blue-screen barricades into a war that isn’t there, or where Anakin and Padmé look out a window at a city that will be inserted in post. You can see that the actors are slightly lost, trying desperately to invent a world with their performances.

With The Mandalorian, that detachment wasn’t a problem. “You were in the environment,” said Carl Weathers, who plays the Bounty Hunter Guild associate Greef Karga in the series. “You don’t have to pretend anymore.”

He described how much the Volume improved VFX-heavy scenes, like the flight down Nevarro’s lava river in the season finale, which would have been nigh-impossible to shoot properly using green screen. “If you’ve got four of us in the boat, right? There are four people — if you don’t actually see the same thing, [they] have four different concepts of what’s going on.”


The Volume builds that world for the actors and creates verisimilitude for the performers as much as for the crew. “Filmmaking, now, all of a sudden, is back to almost old-school,” said Werner Herzog, who guest-stars as “the Client” in the show. “Technology becomes invisible, and that’s a great thing.”

“It put me back in a set,” said Famuyiwa, one of the episode directors. “It put me back where the rules are what you understand.”

What also helps is that the Volume operates alongside plenty of practical effects, sets, and props. The Child, dubbed “Baby Yoda” by fans; Nick Nolte’s Ugnaught, Kuiil; and Waititi’s bounty hunter droid IG-11 are characters one would expect to be created virtually. (And in early discussions, The Child and IG-11 were to be entirely CG creations, with models and puppets only for lighting reference and stand-ins during the shoot.) But they’re instead characters who exist mostly in-camera.

The Limitations Of StageCraft

StageCraft is on its way to becoming industry-standard, but it will by no means be standard across the art form. It’s the future of multimillion-dollar cinema, not of cinema as a whole.

And the tech still isn’t perfect. Chief among its many limitations is the time spent laboring in pre-production: every set needs to be created in full for the lighting to be effective, even if only 40 percent of it is ever shown on-screen. Gone is the luxury of designing only what the camera will see.

There are also technological hurdles that every animation team will bump up against, like colored lighting and reflections. The lava river on Nevarro is a good example: the red tones of the lava are virtual, but because the LED wall is the crew’s primary light source, the river casts a red glow on the actors. Because the set and floor for the lava-river sequence were dark-toned, that red spill enhanced the scene.

But there are plenty of scenes where a prop like a yellow post or a red canister in the virtual set would throw everything off, especially in the brighter locations like Mos Eisley. “As the LED screens are providing all the lighting onto the practical set, you end up with this red and yellow light source that is directly lighting up the sand in front of it,” Bluff told fxguide.


Additionally, crew members who normally only work during principal photography — like the cinematographers and lighting teams — have to learn the technology alongside the VFX teams and start on the project as early as possible to design the environments and plan the shots ahead of time.

That world of possibility lets filmmakers craft the perfect shot virtually, months before they shoot, which risks making the project look too idealized. The cinematographers on The Mandalorian shared this fear. “I won’t always want it to be a perfect backlight, because that ends up looking fake,” Fraser said to American Cinematographer.

That introduces the challenge of having to create your own limitations, something The Mandalorian never quite figured out in its first season. Many scenes — especially those with the Armorer (Emily Swallow) and the fights between Mando and stormtroopers — have an artificial, overly composed aesthetic, as if the shots were designed to be desktop wallpapers rather than stills from a TV show.

That aesthetic owes to the Westerns and samurai films The Mandalorian is emulating: wide shots, slower-paced editing for the fights, and a very non-aggressive, locked-down camera. But that look is also informed by StageCraft’s technical limitations. Due to latency problems, the show couldn’t use styles like shaky cam or warped perspectives. “At no point will you see an 8mm fisheye lens in someone’s face,” Fraser said.

For me, looking back at The Mandalorian knowing how the show was filmed, Mando’s adventures seem smaller in scale. Scenes between two characters in a vast expanse of Tatooine desert feel more insular and less grandiose. The Volume, once you’re aware of it, has the unfortunate effect of making the production feel small.

You start to notice all the scenes that can be boiled down to two to four people sitting down and talking. That limitation makes it hard to imagine how larger-scale projects like the mainline Star Wars films, the Marvel movies, or even something like The Lord of the Rings, where characters are bounding across Middle-Earth hillscapes, could be realized on such a small stage, no matter how much possibility there is to project an infinite world on the LEDs.


Favreau and his team describe StageCraft as an easy-to-use technology that’s going to revolutionize moviemaking. Neither claim holds much water — the tech, for one thing, is extremely complicated. (We didn’t even get into real-time ray tracing or matching visual complexity in V-Ray post-rendering.)

For another, The Mandalorian wasn’t just a happy accident. The Walt Disney Company owns Lucasfilm as well as Industrial Light & Magic, like a magician who owns both a top-hat company and a rabbit farm. ILM works with other studios frequently (No Time to Die, The SpongeBob Movie: Sponge on the Run, and Space Jam: A New Legacy are among its upcoming projects outside the House of Mouse), but I doubt the Volume, expensive as it is, would have been possible in its current state if Disney had needed to work with an outside VFX house.

Conclusion: The Future Of StageCraft

Besides Star Wars, StageCraft and tech like it will be popping up everywhere. LED-stage virtual production is the paradigm-shifting technology that all major studios are now experimenting with.

Weta Digital, for example, announced in July that it had partnered with Avalon Studios and Streamliner to build a new virtual production studio in Wellington, New Zealand, also utilizing Epic Games’ Unreal Engine. Reshoots for James Cameron’s Avatar sequels might take place on these sets, depending on when they’re completed. Smaller production houses like NEP Live and Skyway Studios have created their own LED stages as well.

In an uncertain time in the film industry, when production on everything from blockbusters to commercials has been thrown for a loop, studios need this technology more than ever. With it, location shoots and set builds can be reduced, crowd scenes can be eliminated, and productions can embrace social distancing and smaller crews.

The evolution of this LED virtual production technology will be interesting to track in the coming years. StageCraft will return to Disney+ in the next season of The Mandalorian, still slated to premiere in October 2020. And the new Obi-Wan Kenobi series, created by Chow, one of The Mandalorian’s directors, is probably still years away, but it will also heavily utilize the technology.

The VFX industry has come a long way from King Kong and 2001, and StageCraft has made the galaxy far, far away more accessible than ever before. In many ways, it’s the game-changing tech that Lucas had always been chasing with the Star Wars franchise, and big-budget filmmaking will likely never be the same again.

What do you think about ILM StageCraft and The Mandalorian? Do you think this technology is as paradigm-shifting as it seems? Comment below with your thoughts.

