The Mandalorian bounty hunter (Pedro Pascal) rescues the Child, popularly known as “Baby Yoda.”
Feature

The Mandalorian: This Is the Way

Cinematographers Greig Fraser, ASC, ACS and Barry “Baz” Idoine and showrunner Jon Favreau employ new technologies to frame this Disney Plus Star Wars series.

Jay Holben

Unit photography by François Duhamel, SMPSP, and Melinda Sue Gordon, SMPSP, courtesy of Lucasfilm, Ltd.


This article is an expanded version of the story that appears in our February 2020 print magazine.


A live-action Star Wars television series was George Lucas’ dream for many years, but the logistics of television production made achieving the necessary scope and scale seem inconceivable. Star Wars fans would expect exotic, picturesque locations, but it simply wasn’t plausible to take a crew to the deserts of Tunisia or the salt flats of Bolivia on a short schedule and limited budget. The creative team behind The Mandalorian has solved that problem.



For decades, green- and bluescreen compositing was the go-to solution for bringing fantastic environments and actors together on the screen. (Industrial Light & Magic did pioneering work with the technology for the original Star Wars movie.) However, when characters are wearing highly reflective costumes, as is the case with Mando (Pedro Pascal), the title character of The Mandalorian, the reflection of green- and bluescreen in the wardrobe causes costly problems in post-production. In addition, it’s challenging for actors to perform in a “sea of blue,” and for key creatives to have input on shot designs and composition.


AC, Feb. 2020
This story was originally published in the Feb. 2020 issue of AC. Some images are additional or alternate.

In order for The Mandalorian to work, technology had to advance enough that the epic worlds of Star Wars could be rendered on an affordable scale by a team whose actual production footprint would comprise a few soundstages and a small backlot. An additional consideration was that the typical visual-effects workflow runs concurrent with production, and then extends for a lengthy post period. Even with all the power of contemporary digital visual-effects techniques and billions of computations per second, the process can take up to 12 hours or more per frame. With thousands of shots and multiple iterations, this becomes a time-consuming endeavor. The Holy Grail of visual effects — and a necessity for The Mandalorian, according to co-cinematographer and co-producer Greig Fraser, ASC, ACS — was the ability to do real-time, in-camera compositing on set.


“That was our goal,” says Fraser, who had previously explored the Star Wars galaxy while shooting Rogue One: A Star Wars Story (AC Feb. ’17). “We wanted to create an environment that was conducive not just to giving a composition line-up to the effects, but to actually capturing them in real time, photo-real and in-camera, so that the actors were in that environment in the right lighting — all at the moment of photography.”


The solution was what might be described as the heir to rear projection — a dynamic, real-time, photo-real background played back on a massive LED video wall and ceiling, which not only provided the pixel-accurate representation of exotic background content, but was also rendered with correct camera positional data.

“One of the big problems of shooting blue- and greenscreen composite photography is the interactive lighting.”
— Greig Fraser, ASC, ACS


Mando with the Child on his ship.

If the content was created in advance of the shoot, then photographing actors, props and set pieces in front of this wall could create final in-camera visual effects — or “near” finals, with only technical fixes required, and with complete creative confidence in the composition and look of the shots. On The Mandalorian, this space was dubbed “the Volume.” (Technically, a “volume” is any space defined by motion-capture technology.)


This concept was initially proposed by Kim Libreri of Epic Games while he was at Lucasfilm, and it became the basis of the technology, the “Holy Grail,” that makes a live-action Star Wars television series possible.


In 2014, as Rogue One was ramping up, the concept of real-time compositing was discussed again; the technology had matured to a new level. Visual-effects supervisor John Knoll had an early discussion with Fraser about the concept, and the cinematographer raised the idea of using a large LED screen as a lighting instrument: playing back rough previsualized effects on the screen to cast interactive, animated light on the actors and sets during composite photography. The final animated VFX would be added later; the screens were there merely to provide interactive lighting that matched the animations.


“One of the big problems of shooting blue- and greenscreen composite photography is the interactive lighting,” offers Fraser. “Often, you're shooting real photography elements before the backgrounds are created and you're imagining what the interactive lighting will do — and then you have to hope that what you've done on set will match what happens in post much later on. If the director changes the backgrounds in post, then the lighting isn't going to match and the final shot will feel false.”

Director and executive producer Dave Filoni and cinematographers Greig Fraser, ASC, ACS (center) and Barry “Baz” Idoine (operating camera) on the set.

For Rogue One, the team built a large cylindrical LED screen and created all of the backgrounds in advance for the space battles and landings on Scarif, Jedha and Eadu; all the cockpit sequences in X-Wing and U-Wing spacecraft were shot in front of that LED wall, which served as the primary source of illumination on the characters and sets. Those LED panels had a pixel pitch of 9mm (the distance between the centers of the RGB pixel clusters on the screen). Unfortunately, at that pitch the screen could rarely be placed far enough from the camera to avoid moiré and appear photo-real, so it was used purely for lighting purposes. But because the replacement backgrounds had already been built and used on set, the comps were extremely successful and perfectly matched the dynamic lighting.


In 2016, Lucasfilm president Kathleen Kennedy approached writer-director Jon Favreau about a potential project.


A fisheye view looking through the gap between the two back walls of the show’s LED-wall system, known as “the Volume.” The dark spot on the Volume ceiling is due to a different model of LED screens used there. The ceiling is mostly used for lighting purposes, and if seen on camera is replaced in post.

“I went to see Jon and asked him if he would like to do something for Disney’s new streaming service,” Kennedy says. “I’ve known that Jon has wanted to do a Star Wars project for a long time, so we started talking right away about what he could do that would push technology, and that led to a whole conversation around what could change the production path; what could actually create a way in which we could make things differently?”


Favreau had just completed The Jungle Book and was embarking on The Lion King for Disney, both visual-effects-heavy films.


Visual-effects supervisor Richard Bluff and Rob Bredow, executive creative director and head of ILM, showed Favreau a number of tests that ILM had conducted, including the LED-wall technology from Rogue One. Fraser suggested that, with the advancements in LED technology since Rogue One, the project could leverage new panels and push the envelope on real-time, in-camera visual effects. Favreau loved the concept and decided that was the production path to take.

“I was very encouraged by my experiences using similar technology on Jungle Book and using virtual cameras on The Lion King. I had also experimented with a partial video wall for the pilot episode of The Orville.”
— Jon Favreau, series creator and executive producer


In the background, appearing to float in space, are the motion-tracking cameras peeking between the Volume’s wall and ceiling.

The production sought to minimize green- and bluescreen photography and the post compositing it requires, and to improve the quality of the environment for the actors. The LED screen provides a convincing facsimile of a real set or location and avoids the green void that can be challenging for performers.


“I was very encouraged by my experiences using similar technology on Jungle Book [AC, May ’16], and using virtual cameras on The Lion King [AC, Aug. ’19],” explains Favreau, series creator and executive producer. “I had also experimented with a partial video wall for the pilot episode of The Orville. With the team we had assembled between our crew, ILM, Magnopus, Epic Games, Profile Studios and Lux Machina, I felt that we had a very good chance at a positive outcome.”


“The Volume is a difficult technology to understand until you stand there in front of the ‘projection’ on the LED screen, put an actor in front of it, and move the camera around,” Fraser says. “It’s hard to grasp. It’s not really rear projection; it’s not a TransLite because [it is a real-time, interactive image with 3D objects] and has the proper parallax; and it’s photo-real, not animated, but it is generated through a gaming engine.”


Idoine (left) shooting on the Volume’s display of the ice-planet Maldo Kreis — one of many of the production’s environment “loads” — with director Filoni watching and Karina Silva operating B camera. The fixtures with white, half-dome, ping-pong-style balls on each camera are the “Sputniks” — infrared-marker configurations that are seen by the motion-tracking cameras to record the production camera’s position in 3D space, and to render proper 3D parallax on the Volume wall.

“The technology that we were able to innovate on The Mandalorian would not have been possible had we not developed technologies around the challenges of Jungle Book and Lion King,” offers Favreau. “We had used game-engine and motion-capture [technology] and real-time set extension that had to be rendered after the fact, so real-time render was a natural extension of this approach.”


Barry “Baz” Idoine, who worked with Fraser for several years as a camera operator and second-unit cinematographer on features including Rogue One and Vice (AC Jan. ’19), assumed cinematography duties on The Mandalorian when Fraser stepped away to shoot Denis Villeneuve’s Dune. Idoine observes, “The strong initial value is that you’re not shooting in a green-screen world and trying to emulate the light that will be comped in later — you’re actually shooting finished product shots. It gives the control of cinematography back to the cinematographer.”


The Volume was a curved, 20'-high-by-180'-circumference LED video wall comprising 1,326 individual LED screens of a 2.84mm pixel pitch. It formed a 270-degree arc around a 75'-diameter performance space and was topped with an LED video ceiling set directly onto the main curve of the wall.
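Those figures are internally consistent: a 270-degree arc of a 75'-diameter circle works out to roughly the quoted wall length. The quick check below uses only the numbers cited above; the pixel-density line is simple arithmetic from the stated pitch.

```python
import math

# Figures quoted above: 75'-diameter performance space, a 270-degree arc of wall,
# roughly 180' of total wall length, 2.84mm pixel pitch.
diameter_ft = 75.0
arc_degrees = 270.0

full_circumference_ft = math.pi * diameter_ft              # ~235.6'
arc_length_ft = full_circumference_ft * arc_degrees / 360  # ~176.7', close to the quoted 180'

pitch_mm = 2.84
pixels_per_foot = 304.8 / pitch_mm                         # ~107 pixels per linear foot of wall

print(f"270-degree arc length: {arc_length_ft:.1f} ft (quoted: ~180 ft)")
print(f"Pixel density: {pixels_per_foot:.0f} px per foot of wall")
```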


At the rear of the Volume, in the 90 remaining degrees of open area, essentially “behind camera,” were two 18'-high-by-20'-wide flat panels of 132 more LED screens. These two panels were rigged to traveler track and chain motors in the stage’s perms, so the walls could be moved into place or flown out of the way to allow better access to the Volume area.


“The Volume allows us to bring many different environments under one roof,” says visual-effects supervisor Richard Bluff of ILM. “We could be shooting on the lava flats of Nevarro in the morning and in the deserts of Tatooine in the afternoon. Of course, there are practical considerations to switching over environments, but we [typically did] two environments in one day.”

The crew surrounds the Mandalorian’s spacecraft Razor Crest. Only the fuselage and cockpit are practical set pieces. From this still-camera position, the composition appears “broken,” but from the production camera's perspective, the engines appear in perfect relationship to the fuselage, and track in parallax with the camera’s movement.

“A majority of the shots were done completely in camera,” Favreau adds. “And in cases where we didn’t get to final pixel, the postproduction process was shortened significantly because we had already made creative choices based on what we had seen in front of us. Postproduction was mostly refining creative choices that we were not able to finalize on the set in a way that we deemed photo-real.”


With traditional rear projection (and front projection), in order for the result to look believable, the camera must either remain stationary or move along a preprogrammed path to match the perspective of the projected image. In either case, the camera’s center of perspective (the entrance pupil of the lens, sometimes referred to — though incorrectly — as the nodal point) must be precisely aligned with the projection system to achieve proper perspective and the effects of parallax. The Mandalorian is hardly the first production to incorporate an image-projection system for in-camera compositing, but what sets its technique apart is its ability to facilitate a moving camera.


In the pilot episode, the Mandalorian (Pedro Pascal) brings his prey (Horatio Sanz) into custody.

Indeed, using a stationary camera or one locked into a pre-set move for all of the work in the Volume was simply not acceptable for the needs of this particular production. The team therefore had to find a way to track the camera’s position and movement in real-world space, and extrapolate proper perspective and parallax on the screen as the camera moved. This required incorporating motion-capture technology and a videogame engine — Epic Games’ Unreal Engine — that would generate proper 3D parallax perspective in real time. 


The locations depicted on the LED wall were initially modeled in rough form by visual-effects artists creating 3D models in Maya, to the specs determined by production designer Andrew Jones and visual consultant Doug Chiang. Then, wherever possible, a photogrammetry team would head to an actual location and create a 3D photographic scan. 


“We realized pretty early on that the best way to get photo-real content on the screen was to photograph something,” Bluff attests.


As amazing and advanced as the Unreal Engine’s capabilities were, rendering fully virtual polygons on the fly didn’t produce the photo-real results the filmmakers demanded. In short, 3D computer-rendered sets and environments were not photo-realistic enough to be used as in-camera final images. The best technique was to create the sets virtually, then incorporate photographs of real-world objects, textures and locations, mapping those images onto the 3D virtual objects, a technique commonly known as tiling, or photogrammetry. This is not necessarily a unique or new technique, but the incorporation of photogrammetry elements achieved the goal of creating in-camera finals.


The Mandalorian makes repairs with a rich landscape displayed behind him.

Additionally, photographic “scanning” of a location, which involves taking thousands of photographs from many different viewpoints to generate a 3D photographic model, is a key component in creating the virtual environments.


Enrico Damm served as the production’s environment supervisor, leading the scanning and photogrammetry team that traveled to locations such as Iceland and Utah to shoot elements for the Star Wars planets.


The perfect weather condition for these photographic captures is a heavily overcast day, when there are little to no shadows on the landscape. Harsh sunlight and hard shadows mean the capture cannot easily be re-lit in the virtual world; in those cases, software such as Agisoft De-Lighter was used to analyze the photographs and remove shadows, producing a more neutral canvas for virtual lighting.


Scanning is a faster, looser process than photogrammetry, and it is done from multiple positions and viewpoints; the more parallax introduced, the better the software can resolve the 3D geometry. Damm created a custom rig that straps six cameras to the operator’s body, all firing simultaneously as the operator moves about the location. This allows the team to gather six times as many images in the same amount of time, about 1,800 on average.


Photogrammetry, by contrast, is used to create virtual backdrops, and those images must be shot on a nodal rig to eliminate parallax between the photos. For The Mandalorian, about 30 to 40 percent of the Volume’s backgrounds were created from such photogrammetry backdrops.


Both phases of photography, photogrammetry and scanning, need to be done at various times of day to capture different looks of the landscape.


Lidar scanning systems are sometimes also employed.


The cameras used for scanning were Canon EOS 5D Mark IV and EOS 5DS bodies with prime lenses. Zooms were sometimes incorporated, as modern stitching software has gotten better at solving multiple images shot at different focal lengths.

The Mandalorian (aka “Mando,” played by Pedro Pascal) treks through the desert alone.

This information was mapped onto 3D virtual sets and then modified or embellished as necessary to adhere to the Star Wars design aesthetic. If there wasn’t a real-world location to photograph, the environments were created entirely by ILM’s “environments” visual-effects team. The elements of the locations were loaded into the Unreal Engine video game platform, which provided a live, real-time, 3D environment that could react to the camera’s position. 


The third shot of Season 1’s first episode demonstrates this technology with extreme effectiveness. The shot starts with a low angle of Mando reading a sensor on the icy planet of Maldo Kreis; he stands on a long walkway that stretches out to a series of structures on the horizon. The skies are full of dark clouds, and a light snow swirls around. Mando walks along the trail toward the structures, and the camera booms up.


All of this was captured in the Volume, in-camera and in real time. Part of the walkway was a real, practical set, but the rest of the world was the virtual image on the LED screen, and the parallax as the camera boomed up matched perfectly with the real set. The effect of this system is seamless. 


Because of the enormous amount of processing power needed to create this kind of imagery, the full 180' screen and ceiling could not be rendered high-resolution, photo-real in real time. The compromise was to enter the specific lens used on the camera into the system, so that it rendered a photo-real, high-resolution image based on the camera’s specific field of view at that given moment, while the rest of the screen displayed a lower-resolution image that was still effective for interactive lighting and reflections on the talent, props and physical sets. (The simpler polygon count facilitated faster rendering times.)


Idoine (far left) discusses a shot of “the Child” (aka “Baby Yoda”) with director Rick Famuyiwa (third from left) and series creator/executive producer Jon Favreau (third from right), while assistant director Kim Richards (second from right, standing) and crewmembers listen. Practical set design was often used in front of the LED screen, and was designed to visually bridge the gap between the real and virtual space. The practical sets were frequently placed on risers to lift the floor and better hide the seam of the LED wall and stage floor.

Each Volume load was put into the Unreal Engine video game platform, which provided the live, real-time 3D environment that reacted to the production camera’s position. That position was tracked by Profile Studios’ motion-capture system: infrared (IR) cameras surrounding the top of the LED walls monitored the IR markers mounted on the production camera. When the system recognized the camera’s X, Y, Z position, it rendered proper 3D parallax for that position in real time. The data was fed from Profile into ILM’s proprietary StageCraft software, which managed and recorded the information and the full production workflow as it, in turn, fed the images into the Unreal Engine. The images were then output to the screens with the assistance of the Lux Machina team.
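To make the idea concrete, here is a minimal, generic sketch of how a tracked camera position can drive correct parallax on a flat display: the classic “off-axis” (generalized perspective) projection, in which the render frustum is forced to pass exactly through the physical corners of the panel. This is illustrative Python with made-up coordinates, not ILM’s StageCraft code, and it omits the accompanying view-matrix step.

```python
import numpy as np

def off_axis_projection(eye, pa, pb, pc, near=0.1, far=1000.0):
    """Generalized perspective ("off-axis") projection for a flat display panel.

    eye      -- tracked camera position in world space (e.g., from a mocap system)
    pa/pb/pc -- world-space corners of the panel: lower-left, lower-right, upper-left
    Returns a 4x4 OpenGL-style projection matrix whose frustum passes through the
    panel, so imagery rendered with it shows correct parallax from the eye point.
    """
    vr = (pb - pa) / np.linalg.norm(pb - pa)   # panel right axis
    vu = (pc - pa) / np.linalg.norm(pc - pa)   # panel up axis
    vn = np.cross(vr, vu)
    vn /= np.linalg.norm(vn)                   # panel normal, toward the eye

    va, vb, vc = pa - eye, pb - eye, pc - eye  # eye-to-corner vectors
    d = -np.dot(va, vn)                        # eye-to-panel distance

    l = np.dot(vr, va) * near / d              # frustum extents on the near plane
    r = np.dot(vr, vb) * near / d
    b = np.dot(vu, va) * near / d
    t = np.dot(vu, vc) * near / d

    return np.array([
        [2*near/(r-l), 0,            (r+l)/(r-l),             0],
        [0,            2*near/(t-b), (t+b)/(t-b),             0],
        [0,            0,            -(far+near)/(far-near), -2*far*near/(far-near)],
        [0,            0,            -1,                      0],
    ])

# Example: a 20'-wide by 10'-tall wall segment, with the tracked camera 12' away and 5' up.
eye = np.array([1.0, 5.0, 12.0])
pa, pb, pc = np.array([-10.0, 0, 0]), np.array([10.0, 0, 0]), np.array([-10.0, 10.0, 0])
print(off_axis_projection(eye, pa, pb, pc).round(3))
```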


It takes 11 interlinked computers to serve the images to the wall. Three processors are dedicated to real-time rendering, and four servers deliver three 4K images seamlessly side by side on the wall plus one 4K image on the ceiling, for an image 12,288 pixels wide by 2,160 high on the wall and 4,096 x 2,160 on the ceiling. Even with that horsepower, the full 270 degrees of wall, the movable rear LED panels and the ceiling cannot all be rendered high-resolution and photo-real in real time, which is why the high-resolution render is confined to the camera's field of view, as described above.


Mando stands in a canyon on the planet Arvala. The rocks behind him are on the LED wall, while some practical rocks are placed in the mid- and foreground to blend the transition. The floor of the stage is covered in mud and rocks for this location. On the jib is an Arri Alexa LF with a Panavision Ultra Vista anamorphic lens.

Due to the 10-12 frames (roughly half a second) of latency from the time Profile’s system received camera-position information to Unreal’s rendering of the new position on the LED wall, if the camera moved ahead of the rendered frustum (a term defining the virtual field of view of the camera) on the screen, the transition line between the high-quality perspective render window and the lower-quality main render would be visible. To avoid this, the frustum was projected an average of 40-percent larger than the actual field of view of the camera/lens combination, to allow some safety margin for camera moves. In some cases, if the lens’ field of view — and therefore the frustum — was too wide, the system could not render an image high-res enough in real time; the production would then use the image on the LED screen simply as lighting, and composite the image in post [with a greenscreen added behind the actors]. In those instances, the backgrounds were already created, and the match was seamless because those actual backgrounds had been used at the time of photography [to light the scene].
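A rough back-of-the-envelope calculation shows why the oversized frustum and a restrained camera style go hand in hand. The 40-percent oversize and the 10-to-12-frame latency come from the production as described above; the lens field of view and pan rate below are hypothetical numbers chosen only for illustration.

```python
# Rough latency-margin check for the oversized render frustum.
# The ~40% oversize and 10-12 frame latency are from the article; the lens field
# of view and the resulting pan-rate limit are illustrative assumptions only.
fps = 24.0
latency_frames = 11                      # middle of the quoted 10-12 frame range
latency_s = latency_frames / fps

lens_fov_deg = 40.0                      # hypothetical horizontal field of view
frustum_fov_deg = lens_fov_deg * 1.4     # frustum projected ~40% larger than the lens FOV
margin_per_side_deg = (frustum_fov_deg - lens_fov_deg) / 2

# Fastest pan that keeps the lens FOV inside the high-res frustum despite latency:
max_pan_deg_per_s = margin_per_side_deg / latency_s
print(f"Latency: {latency_s*1000:.0f} ms, margin per side: {margin_per_side_deg:.0f} deg")
print(f"Pan rate must stay under ~{max_pan_deg_per_s:.0f} deg/sec to hide the seam")
```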


Fortunately, says Fraser, Favreau wanted The Mandalorian to have a visual aesthetic that would match that of the original Star Wars. This meant a more “grounded” camera, with slow pans and tilts, and non-aggressive camera moves — an aesthetic that helped to hide the system latency. “In addition to using some of the original camera language in Star Wars, Jon is deeply inspired by old Westerns and samurai films, so he also wanted to borrow a bit from those, especially Westerns,” Fraser notes. “The Mandalorian is, in essence, a gunslinger, and he’s very methodical. This gave us a set of parameters that helped define the look of the show. At no point will you see an 8mm fisheye lens in someone’s face. That just doesn’t work within this language. 


“It was also of paramount importance to me that the result of this technology not just be ‘suitable for TV,’ but match that of major, high-end motion pictures,” Fraser continues. “We had to push the bar to the point where no one would really know we were using new technology; they would just accept it as is. Amazingly, we were able to do just that.” 

Steadicam operator Simon Jayes tracks Mando, Mayfeld (Bill Burr) and Ran Malk (Mark Boone Jr.) in front of the LED wall. While the 10- to 12-frame latency of rendering the high-resolution “frustum” on the wall can be problematic, Steadicam was employed liberally in Episode 6 to great success.

Shot on Arri’s Alexa LF, The Mandalorian was the maiden voyage for Panavision’s full-frame Ultra Vista 1.65x anamorphic lenses. The 1.65x anamorphic squeeze allowed for full utilization of the 1.44:1 aspect ratio of the LF to create a 2.37:1 native aspect ratio, which was only slightly cropped to 2.39:1 for exhibition. 
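The math behind that pairing is straightforward; the sketch below assumes the Alexa LF's published open-gate photosite dimensions.

```python
# Aspect-ratio arithmetic behind the LF + Ultra Vista pairing described above.
sensor_aspect = 4448 / 3096        # Alexa LF open gate photosites, ~1.44:1 (published spec)
squeeze = 1.65                     # Ultra Vista anamorphic squeeze factor

native_ratio = sensor_aspect * squeeze
print(f"{sensor_aspect:.2f}:1 sensor x {squeeze}x squeeze = {native_ratio:.2f}:1")
# ~2.37:1, which is then cropped slightly to the 2.39:1 delivery ratio.
```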


“We chose the LF for a couple reasons,” explains Fraser. “Star Wars has a long history of anamorphic photography, and that aspect ratio is really key. We tested spherical lenses and cropping to 2.40, but it didn’t feel right. It felt very contemporary, not like the Star Wars we grew up with. Additionally, the LF’s larger sensor changes the focal length of the lens that we use for any given shot to a longer lens and reduces the overall depth of field. The T2.3 of the Ultra Vistas is more like a T0.8 in Super 35, so with less depth of field, it was easier to put the LED screen out of focus faster, which avoided a lot of issues with moiré. It allows the inherent problems in a 2D screen displaying 3D images to fall off in focus a lot faster, so the eye can’t tell that those buildings that appear to be 1,000 feet away are actually being projected on a 2D screen only 20 feet from the actor.


Fraser operates an Alexa LF, shooting a close-up of the Ugnaught Kuiil (Misty Rosas in the suit, voiced by Nick Nolte). The transition between the bottom of the LED wall and the stage floor is clearly seen here. That area was often obscured by physical production design or replaced in post.

“The Ultra Vistas were a great choice for us because they have a good amount of character and softness,” Fraser continues. “Photographing the chrome helmet on Mando is a challenge — its super-sharp edges can quickly look video-like if the lens is too sharp. Having a softer acutance in the lens, which [Panavision senior vice president of optical engineering and ASC associate] Dan Sasaki [modified] for us, really helped. The lens we used for Mando tended to be a little too soft for human faces, so we usually shot Mando wide open, compensating for that with ND filters, and shot people 2⁄3 stop or 1 stop closed.” 


According to Idoine, the production used 50mm, 65mm, 75mm, 100mm, 135mm, 150mm and 180mm Ultra Vistas ranging from T2 to T2.8, and he and Fraser tended to expose at T2.5-T3.5. “Dan Sasaki gave us two prototype Ultra Vistas to test in June 2018,” he says, “and from that we worked out what focal-length range to build.”


Director Bryce Dallas Howard confers with actress Gina Carano — as mercenary Cara Dune — while shooting the episode “Chapter 4: Sanctuary.”

“Our desire for cinematic imagery drove every choice,” Idoine adds. And that included the incorporation of a LUT emulating Kodak’s short-lived 500T 5230 color negative, a favorite of Fraser’s. “I used that stock on Killing Them Softly [AC Oct. ’12] and Foxcatcher [AC Dec. ’14], and I just loved its creamy shadows and the slight magenta cast in the highlights,” says Fraser. “For Rogue One, ILM was able to develop a LUT that emulated it, and I’ve been using that LUT ever since.”


“Foxcatcher was the last film I shot on the stock, and then Kodak discontinued it,” continues Fraser. “At the time, we had some stock left over and I asked the production if we could donate it to an Australian film student and they said ‘yes,’ so we sent several boxes to Australia. When I was prepping Rogue One, I decided that was the look I wanted — this 5230 stock — but it was gone. On a long shot, I wrote an email to the film student to see if he had any stock left and, unbelievably, he had 50 feet in the bottom of his fridge. I had him send that directly to ILM and they created a LUT from it that I used on Rogue and now Mandalorian.”


Actor Giancarlo Esposito as Moff Gideon, an Imperial searching for the Child.

A significant key to the Volume’s success in creating in-camera final VFX was color-matching the wall’s LED output to the color matrix of the Arri Alexa LF. ILM’s Matthias Scharfenberg, J. Schulte and their team did thorough testing of the Roe Black Pearl LEDs’ capabilities and matched them to the color sensitivity and reproduction of the LF to make the two seamless partners. LEDs are very narrow-band emitters: the red, green and blue diodes each output a very narrow spectrum of color, which makes some colors difficult to reach, and making the panels compatible with the color filter array on the camera’s ALEV-III sensor was a further challenge. Using a carefully designed series of color patches, a calibration sequence was run on the LED wall to sync it with the camera’s sensitivity. Any other camera model shooting in the Volume will therefore not receive proper color, but the Alexa LF will; and while the color reproduction of the LEDs may not have looked right to the eye, through the camera it appeared seamless. In short, off-the-shelf LED panels won’t quite deliver the accuracy necessary for a high-end production, but with custom tweaking they were successful. There were limitations, however: with low-light backgrounds, the screens would block up and alias in the shadows, making them unsuitable for in-camera finals, although with further development of the color science this has been solved for Season 2.
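As a conceptual illustration only (not ILM's actual color pipeline), one common way to perform this kind of patch-based calibration is to fit a 3x3 matrix describing the wall-to-camera response and then pre-correct content by its inverse. The patch values in this sketch are hypothetical.

```python
import numpy as np

# Minimal sketch of a generic patch-based calibration idea (not ILM's pipeline):
# display known color patches on the LED wall, record how the camera sees them,
# fit a 3x3 matrix describing the wall-to-camera response, then pre-correct
# content by the inverse so the camera records the intended values.

displayed = np.array([    # linear RGB values sent to the wall (hypothetical patches)
    [1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0],
    [0.5, 0.5, 0.5], [0.8, 0.6, 0.2], [0.2, 0.4, 0.7],
])
recorded = np.array([     # linear RGB the camera recorded off the wall (hypothetical)
    [0.92, 0.05, 0.02], [0.06, 0.88, 0.07], [0.01, 0.09, 0.95],
    [0.48, 0.51, 0.49], [0.74, 0.62, 0.23], [0.22, 0.43, 0.66],
])

# Fit the system response R so that displayed @ R ~= recorded (least squares).
R, *_ = np.linalg.lstsq(displayed, recorded, rcond=None)

# Pre-correct content with R's inverse before sending it to the wall.
precorrect = np.linalg.inv(R)
content = np.array([[0.6, 0.3, 0.1]])            # the color we want the camera to see
wall_signal = content @ precorrect               # what we actually display
print("Camera will record:", (wall_signal @ R).round(3))  # ~[0.6, 0.3, 0.1]
```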


A significant asset of the LED Volume is the interactive lighting that its imagery provides on the actors, sets and props within it. The light projected from the LED wall gives a realistic sense of the actor (or set and props) being within that environment in a way that is rarely achievable with green- or bluescreen composite photography. If the sun is low on the horizon on the LED wall, that area of the wall will be significantly brighter than the surrounding sky, and it will create a bright highlight on the actors and objects in the Volume just as the real sun would from that position. Reflections of the environment from the walls and ceiling show up in Mando’s costume as if he were actually in that real-world location.

The Razor Crest sits on the Volume stage. Only the fuselage of the ship is practical.

“When you’re dealing with a reflective subject like Mando, the world outside the camera frame is often more important than the world you see in the camera’s field of view,” Fraser says. “What’s behind the camera is reflected in the actor’s helmet and costume, and that’s crucial to selling the illusion that he’s in that environment. Even if we were only shooting in one direction on a particular location, the virtual art-department would have to build a 360-degree set so we could get the interactive lighting and reflections right. This was also true for practical sets that were built onstage and on the backlot — we had to build the areas that we would never see on camera because they would be reflected in the suit. In the Volume, it’s this world outside the camera that defines the lighting. 


“When you think about it, unless it’s a practical light in shot, all of our lighting is outside the frame — that’s how we make movies,” Fraser continues. “But when most of your lighting comes from the environment, you have to shape that environment carefully. We sometimes have to add a practical or a window into the design, which provides our key light even though we never see that [element] on camera.”


The fight with the mudhorn likely negated any worry about helmet reflections for this scene.

The interactive lighting of the Volume also significantly reduces the need for traditional film-production lighting equipment and crew. The light emitted from the LED screens becomes the primary lighting on the actors, sets and props within the Volume, and because this light comes from a virtual image of the set or location, its organic quality firmly grounds those elements in the reality presented.


There were, of course, limitations. Although LEDs are bright and capable of emitting a good deal of light, they cannot re-create the intensity and quality of direct, natural daylight. “The sun on the LED screen looks perfect because it’s been photographed, but it doesn’t look good on the subjects — they look like they’re in a studio,” Fraser attests. “It’s workable for close-ups, but not really for wide shots. For moments with real, direct sunlight, we headed out to the backlot as much as possible.” That “backlot” was an open field near the Manhattan Beach Studios stages, where the art department built various sets. (Several stages were used for creating traditional sets as well.)


Overcast skies, however, proved a great source in the Volume. The skies for each “load” — the term given for each new environment loaded onto the LED walls — were based on real, photographed skies. While shooting a location, the photogrammetry team shot multiple stills at different times of day to create “sky domes.” This enabled the director and cinematographer to choose the sun position and sky quality for each set. “We can create a perfect environment where you have two minutes to sunset frozen in time for an entire 10-hour day,” Idoine notes. “If we need to do a turnaround, we merely rotate the sky and background, and we’re ready to shoot!”


Idoine (seated at camera) in discussion with Favreau and Filoni on a practical set.

During prep, Fraser and Idoine spent a lot of time in the virtual art department, whose crew created the virtual backgrounds for the LED loads. They spent many hours going through each load to set sky-dome choices and pick the perfect time of day and sun position for each moment. They could select the sky condition they wanted, adjust the scale and the orientation, and finesse all of these attributes to find the best lighting for the scene. Basic, real-time ray tracing helped them see the effects of their choices on the virtual actors in the previs scene. These choices would then be saved and sent off to ILM, whose artists would use these rougher assets for reference and build the high-resolution digital assets. 


The virtual art department starts by creating 3D virtual sets of each location to production designer Andrew Jones’ specifications; the director and cinematographer can then enter the virtual location with VR headsets and do a virtual scout. Digital actors, props and sets are added and can be moved about, and coverage is chosen during the virtual scout. The cinematographer then follows the process as the virtual set is further textured with photogrammetry elements and the sky domes are added.


The virtual world on the LED screen works for many purposes, but an actor obviously cannot walk through the screen, so doorways are one aspect of production design that must be physical: if a character walks through a door, that door has to be real.


Favreau gets his western-style saloon entrance from the first episode of The Mandalorian.

If an actor is close to a set piece, it is generally preferable for that piece to be physical rather than virtual; if they’re near a wall, it should be a physical wall, so the actor is actually close to something real.


Many objects that are physical are also virtual. Even if a prop or set piece is physically constructed, it is scanned and incorporated into the virtual world so that it becomes not only a practical asset, but a digital one as well. Once it’s in the virtual world, it can be turned on or off on a particular set or duplicated. 


“We take objects that the art department have created and we employ photogrammetry on each item to get them into the game engine,” explains virtual-production supervisor Clint Spillers. “We also keep the thing that we scanned and we put it in front of the screen, and we’ve had remarkable success getting the foreground asset and the digital object to live together very comfortably.”

Another production-design challenge is that every set must be executed in full 360 degrees. While in traditional filmmaking a production designer may be tempted to shortcut a design, knowing that the camera will only see a small portion of a particular set, in this world the set that is off camera is just as important as the set that is seen on camera.


“This was a big revelation for us early on,” attests production designer Andrew Jones. “We were, initially, thinking of this technology as a backdrop — like an advanced translight or painted backdrop — that we would shoot against and hope to get in-camera final effects. We imagined that we would design our sets as you would on a normal film: i.e., the camera sees over here, so this is what we need to build. In early conversations with DP Greig Fraser, he explained that the off-camera portion of the set — that might never be seen on camera — was just as vital to the effect. The whole Volume is a light box, and what is behind the camera is reflected on the actors’ faces, costumes and props. What’s behind the camera is actually the key lighting on the talent.


IG-11 and Mando encounter their target.

“This concept radically changed how we approach the sets,” Jones continues. “Anything you put in The Volume is lit by the environment, so we have to make sure that we conceptualize and construct the virtual set in its entirety of every location in full 360. Since the actor is, in essence, a chrome ball, he’s reflecting what is all around him so every detail needs to be realized.”


They sometimes used photogrammetry as the basis, but always relied upon the same visual-effects artists who create environments for the Star Wars films to realize these real-time worlds — “baking in” lighting choices established earlier in the pipeline with high-end, ray-traced rendering. 


“I chose the sky domes that worked best for all the shots we needed for each sequence on the Volume,” Fraser notes. “After they were chosen and ILM had done their work, I couldn’t raise or lower the sun because the lighting and shadows would be baked in, but I could turn the whole world to adjust where the hot spot was.”


Once a load was live on the Volume after ILM’s finalization, the adjustments that could be made to its sky dome were limited. The world could be rotated, the center position changed, and the intensity and color adjusted, but the actual position of the sun in the sky dome couldn’t be altered, because ILM had already ray-traced the terrain and “baked” its shadows in for that sun position; this was done to minimize the computation required for advanced ray tracing in real time. If the sun position were changed, those baked-in shadows wouldn’t change, only the elements reserved for real-time rendering and simple ray tracing would be affected, and the mismatch between lighting direction and baked-in shadows would make the backgrounds look false.



From time to time, traditional lighting fixtures were added to augment the output of the Volume. 


In the fourth episode, the Mandalorian is looking to lie low; he travels to the remote farming planet of Sorgan and visits the common house, a thatched, basket-weave structure. The actual common house was a miniature built by the art department and then photographed for inclusion in the virtual world. The miniature was lit with a single hard light source that emulated natural daylight breaking through the thatched walls. “You could clearly see that one side of the common house was in hard light and the other side was in shadow,” recalls Idoine. “There were hot spots in the model that really looked great, so we incorporated LED ‘movers’ with slash gobos and Charlie Bars [long flags] to break up the light in a similar basket-weave pattern. Because of this very open basket-weave construction and the fact that the load had a lot of shafts of light, I added in random slashes of hard light into the practical set and it mixed really well.”


The Volume could incorporate virtual lighting, too, via the “Brain Bar,” a NASA Mission Control-like section of the soundstage where as many as a dozen artists from ILM, Unreal and Profile sat at workstations and made the technology of the Volume function. Their work was able to incorporate on-the-fly color-correction adjustments and virtual-lighting tools, among other tweaks. 


Matt Madden, president of Profile and a member of the Brain Bar team, worked closely with Fraser, Idoine and gaffer Jeff Webster to incorporate virtual-lighting tools via an iPad that communicated back to the Bar. He could create shapes of light on the wall of any size, color and intensity. If the cinematographer wanted a large, soft source off-camera, Madden was able to create a “light card” of white just outside the frustum. The entire wall outside the camera’s angle of view could be a large light source of any intensity or color that the LEDs could reproduce.
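As a rough illustration of the idea (the data structure and scale here are hypothetical, not Profile's actual tooling), a "light card" can be thought of as little more than a colored rectangle composited into the wall raster outside the frustum:

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class LightCard:
    """Hypothetical description of a soft 'light card' drawn on the LED wall
    outside the camera frustum (illustrative only, not the production's tooling)."""
    center: tuple        # (x, y) position on the wall canvas, in pixels
    size: tuple          # (width, height) in pixels
    color: tuple         # linear RGB, 0-1
    intensity: float     # brightness multiplier

def rasterize(cards, width, height):
    """Composite light cards over a black canvas representing the wall raster."""
    canvas = np.zeros((height, width, 3), dtype=np.float32)
    for card in cards:
        cx, cy = card.center
        w, h = card.size
        x0, x1 = max(0, cx - w // 2), min(width, cx + w // 2)
        y0, y1 = max(0, cy - h // 2), min(height, cy + h // 2)
        canvas[y0:y1, x0:x1] = np.array(card.color) * card.intensity
    return canvas

# A 1/8-scale preview of the 12,288 x 2,160 wall raster: a large warm source
# camera-left plus a narrow white backlight strip near the top of the wall.
preview = rasterize(
    [
        LightCard(center=(250, 135), size=(190, 225), color=(1.0, 0.85, 0.7), intensity=0.8),
        LightCard(center=(768, 20),  size=(1000, 26), color=(1.0, 1.0, 1.0),  intensity=1.0),
    ],
    width=12288 // 8, height=2160 // 8,
)
print(preview.shape, float(preview.max()))
```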



In this case, the LED wall was made up of Roe Black Pearl BP2 screens with a maximum brightness of 1,800 nits; at 10.674 nits per footcandle, the wall at peak brightness could create an intensity of about 168 footcandles. That’s the equivalent of f/8 3/4 at ISO 800 (24 fps, 180-degree shutter). While the Volume was never shot at peak full white, any lighting “cards” that were added were capable of outputting this brightness.
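Those figures can be sanity-checked with a few lines of arithmetic. The peak brightness and the nits-to-footcandle conversion are the article's; the incident-meter calibration constant is an assumed, typical value, so the resulting stop is only approximate.

```python
import math

# Rough check of the brightness figures quoted above. The 1,800-nit peak and the
# nits-per-footcandle conversion come from the article; the meter calibration
# constant C below is a typical incident-meter value and is an assumption here.
peak_nits = 1800.0
nits_per_fc = 10.674
footcandles = peak_nits / nits_per_fc   # ~168 fc, matching the article

lux = footcandles * 10.764              # footcandles -> lux
iso = 800.0
shutter_s = 1.0 / 48.0                  # 24 fps, 180-degree shutter
C = 330.0                               # assumed incident-meter calibration constant

f_number = math.sqrt(lux * iso * shutter_s / C)
print(f"{footcandles:.0f} fc ~ {lux:.0f} lux -> about f/{f_number:.1f} at ISO 800")
# Lands in the f/8-to-f/11 neighborhood, consistent with the quoted f/8 3/4.
```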


Idoine discovered that a great additional source for Mando was a long, narrow band of white near the top of the LED wall. “This wraparound source created a great backlight look on Mando’s helmet,” Idoine says. Alternatively, he and Fraser could request a tall, narrow band of light on the wall that would reflect on Mando’s full suit, similar to the way a commercial photographer might light a wine bottle or a car — using specular reflections to define shape.


Additionally, virtual black flags (areas where the LED wall was set to black) could be added wherever needed, and at whatever size. The transparency of the black could also be adjusted to any percentage to create virtual nets.


Kuiil on his blurrg.

The virtual LED environments were hugely successful, but traditional greenscreen still played a significant role in the production of The Mandalorian, and it was always on hand — especially for situations where the frustum was too wide for the system to adequately respond. The Volume was also capable of producing virtual greenscreen on the LED wall, which could be any size, and any hue or saturation of green. Among the benefits of virtual green-screen were that it required no time to set up or rig, and its size could be set to precisely outline the subject to be replaced — which greatly minimized and sometimes even eliminated green spill.


With so little spill, the need to de-spill green in post, a time-consuming and tedious process, was all but eliminated.
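The sizing logic is simple pinhole geometry: project the subject's extents from the camera position onto the wall plane and add a small margin. The numbers in this sketch are hypothetical.

```python
# Back-of-the-envelope sizing for a virtual greenscreen patch on the LED wall:
# project the subject's extents from the camera position onto the wall plane
# (simple similar-triangles geometry; illustrative numbers only).

def green_patch_size(subject_w_ft, subject_h_ft, subject_dist_ft, wall_dist_ft, margin=1.10):
    """Return (width, height) in feet of wall needed to back the subject,
    with a small safety margin around the outline."""
    scale = wall_dist_ft / subject_dist_ft
    return subject_w_ft * scale * margin, subject_h_ft * scale * margin

# A 6'-tall, 3'-wide actor standing 15' from camera, with the LED wall 35' away.
w, h = green_patch_size(3.0, 6.0, 15.0, 35.0)
print(f"Green patch needed: ~{w:.1f}' x {h:.1f}' of wall")
# ~7.7' x 15.4' -- far smaller than a full greenscreen backing, hence minimal spill.
```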


The virtual greenscreens can, of course, appear only on the LED wall. If the lower portions of an actor’s body or a set piece need to be composited, physical greenscreen is required, as the floor is not an LED screen and a virtual green cannot extend past the wall.


When green is employed, live compositing is possible on the camera operator’s onboard monitor and the director’s monitor, so they can see the elements that will be composited into the shot and compose the framing accordingly.


The Mandalorian workflow was somewhat inverted, because — unlike on typical productions — virtual backgrounds and CG elements had to be finished before principal photography commenced. Once the cinematographer approved the locations and lighting in the virtual art-department, the images were delivered to ILM for their work, which took about six weeks to complete for each load. At the time of photography, some manipulation and alteration of the virtual elements could take place, but many decisions about coverage, blocking and lighting were already locked in. Naturally, this required a degree of collaboration among the director, cinematographer, production designer and visual-effects supervisor that was closer than that on a typical production. But, as Fraser notes, it also meant that the cinematographer was hands-on throughout the creation of all the images.


“In today’s production workflow, the cinematographer comes in to prep the film and then shoots and then is sent away until the grading process — so much work with the image happens in post that we’re not a part of,” asserts Fraser. “This inverted production workflow of The Mandalorian keeps the cinematographer in the loop from the development of the image to the final image, which is often captured in-camera. Baz and I are there to shepherd the image through the whole process and this is so supremely important. Cinematographers work to design imagery every day, 12 hours a day, and we know how to look at an image and know immediately if it’s right or wrong. Visual effects artists have amazing skills, but they don’t always have the photographic experience to know what is right or wrong and a lot of times what we plan and photograph doesn’t get translated through post. With this kind of workflow we supervise every element of the shot and have a much closer partnership with visual effects to make sure that it works with what we and the director planned and executed on set. We get that final shot in camera and the result is pretty amazing.”


“I personally enjoy that pipeline,” Favreau attests. “I have tried to learn as much as I could from the way animation approaches the pre-production and production schedule. I think the earlier in the process you can solve story issues, the more efficient the production process becomes. Animation has traditionally front-loaded the story process, whereas live-action allows you to kick the can down the road.”


The Bounty Hunter IG-11 is after the asset.

The ability to select perfect lighting conditions would seem to let a cinematographer create the perfect look for every shot. How wonderful would it be to have magic hour all day long, or even all week long? Yet Fraser is keenly aware of the danger of making things too perfect and introducing an unnecessary artifice into the show’s overall visual style. “I won’t always want it to be a perfect backlight, because that ends up looking fake,” Fraser attests. “It’s not real. In the real world, we have to shoot at different times and we have to compromise a little, so if I build in a little bit of ‘roughness,’ it will feel more real and less fake. The idea is to introduce a little ‘analog’ to make the digital look better, to make it feel more real and make the effect even more seamless as if it were all real locations.


“That’s where the system is very good,” Fraser continues. “It allows you to see what you're photographing in real-time. There are times in the real world where you don’t have a choice and you have to shoot something front-lit, but you still work to make it look pleasing. You shoot an exterior for four hours and the sun moves constantly and you get locked into those couple shots that aren’t perfectly backlit — but that’s reality. When you have the ability to make it perfect for every shot, that doesn’t feel right, so we had to really think about building in some analog. Jon was really keenly aware of this, as well. He had just finished doing The Lion King with Caleb [Deschanel, ASC] and they had scenes that they would stage in direct hard noon sunlight to give the film a more realistic feeling instead of just doing everything perfectly at golden hour — that just feels false.”


“We all felt a little like film students at the start of this,” Fraser says. “It’s all new, and we were discovering the limitations and abilities of the system as we went along. We continually pushed the system to break it and see where the edges of the envelope were — but the technology continued to evolve and allow us to push that envelope further. We’d say, ‘Oh, man, I wish we could …’ and someone at the Brain Bar would say, ‘Yeah, I think we can!’ And the next day we’d have that ability. It was pretty amazing.”

Idoine readies the camera for a scene.


Fraser and Idoine earned 2020 Emmy nominations for their work on the series, and spoke with interviewer Larry Sher, ASC about their process in a Clubhouse Conversations video.



Idoine was later invited to become a member of the ASC.

TECH SPECS
2.39:1 Anamorphic
Digital Capture
Arri Alexa LF
LF Open Gate, ArriRaw, 4.5K
Panavision Ultra Vista, 1.65x squeeze

Gina Carano is introduced as Cara Dune



