Feature

Mastering the Elements for Avatar: The Last Airbender

Michael Goi, ASC, ISC; Michael Balfry, CSC; and Stewart Whelan, ISC employ traditional and cutting-edge techniques to adapt a fantastical tale.

By Jay Holben | Interviews by Noah Kadner

Michael Goi, ASC, ISC was not planning to be one of the series’ cinematographers when he signed onto Avatar: The Last Airbender as an executive producer and producing director, but the need for such expertise became apparent very early on in prep — well before any cinematographers had been hired.


The Netflix production is a live-action adaptation of the popular animated series by Michael Dante DiMartino and Bryan Konietzko, and it called for a combination of backlot, traditional-soundstage and Vancouver-based location work — as well as additional capture on an 80'x27.5' ICVFX stage built by Pixomondo and William F. White International at the Canadian Motion Picture Park in British Columbia.


“I never went into Avatar thinking that everything was going to be a huge digital effect. How we shot the show was going to be, for me, a mix of high technology and cut-and-paste filmmaking.”
— Michael Goi, ASC, ISC

For the ICVFX work, “some of the visual assets had to be lit approximately one year before the show was to be filmed,” Goi recalls. “I emphasized to production that a cinematographer needed to fulfill that responsibility. I said, ‘We can’t light the environments without a cinematographer.’”


Series showrunner Albert Kim, who brought him onto the project, suggested Goi could provide that expertise. The cinematographer recalls responding, “Only if I’m credited, paid and acknowledged as being the director of photography.


“It was important to me to make the statement that you need to have a cinematographer involved at the very beginning of a virtual production, when the lighting is being designed,” he continues. “You don’t want the cinematographer to come in later and be relegated to simply duplicating visual approaches that were arbitrarily made long before they were hired. Cinematographers have a creative contribution, and it should come at the beginning of the creative cycle. Happily, the production agreed.”


Cinematographers Stewart Whelan, ISC (left) and Michael Balfry, CSC.

Ultimately, Goi directed and shot Episodes 1 and 2, and the series’ cinematography team was rounded out by Michael Balfry, CSC (Episodes 3, 4, 7 and 8) and Stewart Whelan, ISC (Episodes 5 and 6). “I was looking for cinematographers who could embrace the varied styles of the worlds in the show and give it an epic look,” says Goi. “Because of the ages of some of the actors, the cinematographers also needed to be able to work quickly despite the large number of technical challenges. Michael Balfry and Stewart Whelan checked all those boxes and were able to accommodate the checkered prep schedule required by the virtual-production asset development.”


The other directors were Jabbar Raisani (Episodes 3 and 4), who doubled as an executive producer and visual-effects supervisor; co-executive producer Roseanne Liang (Episodes 5 and 6); and Jet Wilkinson (Episodes 7 and 8).


Avatar is set in a fictional realm where humans exist in four nations, each based on an element of nature: Water Tribes, Earth Kingdom, Fire Nation and Air Nomads. Within each faction are “benders,” who can manipulate their respective elements telekinetically. An Avatar is a bender who has mastered all four elements. Twelve-year-old Aang (Gordon Cormier), the last surviving Air Nomad, has been determined to be the next Avatar.


Aang finds his center as Katara (Kiawentiio) and Sokka (Ian Ousley) look on.

Goi, a longtime fan of the animated series, was keen to “make sure we made something the hardcore fans would appreciate and embrace,” he says. “In my pitch session, I talked about the essence of the show and what I wanted to make sure translated into the live-action version — like how the fights were choreographed, how they were composed, and the color palette of certain scenes.


“I always loved in the animated show when there’d be this big fight going on, and they’d go to this high-angle view and you’d see the two different-colored beams of telekinetic elemental energy colliding in the middle,” Goi adds. “Those were things I wanted to make sure were represented in the show.”


Traditional Techniques
Another creative choice of Goi’s was to root the visual approach in traditional cinematography methods — some dating even to the silent era. “I’m a huge fan of silent movies, and I’m a big fan of doing things in-camera,” he says. “I never went into Avatar thinking that everything was going to be a huge digital effect. How we shot the show was going to be, for me, a mix of high technology and cut-and-paste filmmaking. Keeping a hands-on approach made me, in some ways, feel closer to what the characters were experiencing.


“Also, as visual processes become more perfected, I love embracing non-perfection as a way of helping the viewer feel that what they are watching was created by people, not machines.”


The statue of Yangchen.

Elaborating on some examples of his “cut-and-paste” approach on Avatar, Goi continues, “I used foreground miniatures in some instances to create a sense of size and scope while also doing the work primarily in the camera. The statue of Yangchen was a bust that was only 2 feet high, but it looks like a 40-foot statue in the final shots. This was accomplished by using a 10mm lens at T22, so that everything from 10 inches to infinity was in sharp focus.


“Similarly, when our heroes travel to the island of Kyoshi in Episode 2, the statue of Kyoshi is intended to be 150 feet tall. To create that impression, we put a miniature bust of the statue on a high-roller stand and used a telescoping crane to circle the head and look down at actors walking below, again using a 10mm lens.”
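
As general background rather than production data, the two statue tricks Goi describes can be sanity-checked with a bit of arithmetic: a miniature reads at full scale when its height-to-distance ratio matches the full-size object’s, and a very wide lens stopped well down carries focus from inches to infinity. The short Python sketch below runs those numbers using assumed values (a 0.030mm circle of confusion, and an illustrative 200-foot distance for where the full-size statue would stand).

```python
# Back-of-envelope numbers for the foreground-miniature shots described above.
# Assumptions, not production data: a 0.030 mm circle of confusion and the
# T-stop treated as the working f-number.

def hyperfocal_mm(focal_mm: float, f_number: float, coc_mm: float = 0.030) -> float:
    """Hyperfocal distance: focused here, sharpness runs from roughly half
    this distance out to infinity."""
    return focal_mm ** 2 / (f_number * coc_mm) + focal_mm

def miniature_distance_ft(full_distance_ft: float, full_height_ft: float,
                          model_height_ft: float) -> float:
    """Forced perspective: a miniature subtends the same angle as the
    full-size object when its distance is scaled by the height ratio."""
    return full_distance_ft * (model_height_ft / full_height_ft)

H = hyperfocal_mm(10, 22)  # the 10mm lens at T22 mentioned in the text
print(f"hyperfocal ~ {H:.0f} mm ({H / 25.4:.1f} in); near limit ~ {H / 2 / 25.4:.1f} in")

# If the full-size 40 ft statue were meant to read about 200 ft away (an
# assumed figure), the 2 ft bust would sit roughly 10 ft from the camera.
print(f"bust distance: {miniature_distance_ft(200, 40, 2):.0f} ft")
```

Under those assumptions, the 10mm lens at T22 holds focus from just a few inches outward, which is why a 2-foot bust close to the lens and actors far beyond it can both read sharp in the same frame.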


The statue of Kyoshi.

Goi’s love of Buster Keaton movies came to the fore in planning an Episode 1 sequence in which Katara (Kiawentiio) and Sokka (Ian Ousley) are in a boat that gets swept up by a wave and carried into a massive ice cave, where they’re ejected from the boat as it hits an ice shelf.


“In our preproduction talks, I was told that was potentially the most expensive shot in the entire episode,” Goi recalls. “I said, ‘There’s not going to be any CGI in this shot.’ And they said, ‘How can you do that?’ I said, ‘It’s very simple: We have the boat, a physical piece with stunt performers in it; we have the ice shelf, a practical build by production designer Michael Wylie; and we have the ice cave on the LED wall, a virtual-production asset we’ve already accounted for. All we do is put the boat on some wheels, tie a rope to the wheels, have a couple of crewmembers pull the rope, and the boat slides across the studio floor, hits the ice shelf, and then the stunt people are ejected out.’


Executive producer, producing director and cinematographer Michael Goi, ASC, ISC works with Kiawentiio and Ousley on a water sequence that — with the aid of stunt performers — included the use of classic special-effects techniques.

“They said, ‘What about the water?’ And I said, ‘We build a 4-by-4-foot water trough and put it in front of the camera, which has a 70mm lens with a split diopter — probably a +2 — on the bottom, and then we pan across with the water in the foreground and the boat farther back, but both appearing onscreen to be at the same depth within the frame.’


“When they said, ‘What about the wave that propels the boat?’ I said, ‘Two crewmembers with 5-gallon buckets of water stand on the side, and on “Action,” they pour the water into the water trough, creating the wave — and the other crew pull the rope to move the boat.’


“And they were like, ‘Really? This is going to work?’ And I said, ‘Yes, it worked in the 1920s for Buster Keaton. It’ll work!’


“So, that’s what we did, and it ended up being the first shot on the very first day of filming. When we shot it, everybody’s eyes were glued to the monitor. At the end, when the stunt performers got ejected, everybody applauded, and they said, ‘Wow! It’s great that you knew how to do that!’ I said, ‘I’ve never done that before. It seemed like it would work in theory, and it certainly seemed like it would be much more fun than doing it another way!’”
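
For readers unfamiliar with split diopters, the water-trough setup Goi describes comes down to simple lens arithmetic: the diopter adds its power (in inverse meters) to whatever the main lens is focused on, so one half of the frame carries focus much closer than the other. The minimal sketch below illustrates the idea with assumed distances; the +2 value comes from Goi’s account, but the boat’s distance is an illustrative guess, not a production measurement.

```python
# Thin-lens arithmetic for the split-diopter gag described above. The +2 comes
# from Goi's account; the focus distances are illustrative assumptions, not
# the production's measurements.

def diopter_focus_m(main_focus_m: float, diopter_power: float) -> float:
    """Distance that lands in focus through the diopter half of the frame:
    the diopter's power (in 1/m) simply adds to the lens's focus setting."""
    return 1.0 / (1.0 / main_focus_m + diopter_power)

# Main 70mm lens focused on the boat farther back (say, 6 m) or at infinity;
# the +2 half of the frame then carries focus around half a metre away,
# right where the foreground water trough sits.
for main_focus in (6.0, float("inf")):
    print(f"lens at {main_focus} m -> diopter side sharp at "
          f"{diopter_focus_m(main_focus, 2.0):.2f} m")
```

The unfiltered half of the frame holds the boat at the lens’s set focus while the +2 half carries the trough about half a meter away, which is how two subjects at very different depths appear equally sharp in one pass.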

Complementary Anomalies
After testing a variety of cameras, the production chose the Sony Venice, which Goi favored “because I like the color space, and it gave me a built-in level of contrast that I felt was easiest to work with.” The three cinematographers also agreed that shooting anamorphic would “give us the kind of visual anomalies that complemented the feel of the show.”


After choosing Panavision T Series lenses “for their high resolution and visual perfection,” Goi started seeing “a lot of moiré in our virtual-production tests — regardless of focal length, regardless of distance to subject. I was told VFX had a provision in their budget for digital fixes, but I was concerned that this was happening in almost every shot. So, I called [ASC associate and Panavision’s senior vice president of optical engineering and lens strategy] Dan Sasaki. Dan said, ‘Pack up all your lenses and send them to me.’ We did, and he tweaked just the right element at just the right angle and sent them back, and we had virtually no moiré issues from that point on! And we had full resolution, which is what we wanted. It’s no exaggeration to say that Dan Sasaki saved the production millions of dollars, because that would have been a lot of digital fixes!”


Aang studies an Avatar predecessor.

The show’s final 2.20:1 aspect ratio was an evolution of sorts. The filmmakers originally wanted to frame in 2.39:1, but Netflix requested 2.0:1 to accommodate its various markets and distribution platforms. “Ultimately,” says Goi, “the show was distributed in 2.20, which helped deliver the big-screen look we wanted without compromising their distribution platforms.”


“Light It Like a Normal Show”
Cinematographer Whelan was new to shooting on an LED volume and excited about exploring its potential. First, though, he needed to familiarize himself with Avatar’s source material. “It was super exciting when the project came across my path,” he recalls. “I had to go and watch [several] years’ worth of animations, which was a lot of fun!


“I just loved Michael Goi’s approach to the show,” Whelan continues. “He said, ‘You just light it like you’re lighting a normal show.’ They had the stage in a constant prep/test mode with a full kit and crew there with a camera, so we were able to go in and see through our lenses what the virtual sets would look like on the LED walls.”


He observes that the Panavision T Series anamorphics were particularly well suited to the production. “The combination of a little-bit-older anamorphic glass with the Panavision characteristics and bokeh — the sheer beauty of those lenses — softened the image, but created a very authentic feel for the in-camera composites. I felt like I was living in these environments and able to take the audience to places that we could never actually go with a camera.”


Iroh (Paul Sun-Hyung Lee, left) counsels Prince Zuko (Dallas Liu) in their pursuit of Aang.

On the experience of virtual production and ICVFX with LED walls, cinematographer Balfry says, “It’s really nice to have your set or location there behind you [when you’re shooting], because then you know how it feels. When you’re doing greenscreen, you’ll plan the motivating light, but once [the shot is] out of your hands, you just hope there’s continuity between the lighting and what they create in the background. It was really fun to be able to see our [virtual] sets while we were shooting. I felt like I was part of the whole image process.”


Whelan strove to incorporate as much film lighting as possible. “The screens are very useful, but I think it’s a misconception that you can light [entire scenes] with them,” he says. “They’re great for the wide shots, but as you come in, you end up using a lot more traditional lighting.


“The stage ceiling had a beautifully devised latch system where you could hinge up the panels, and above that was a complete lighting rig,” Whelan continues. “We plumbed that rig for every kind of light we wanted — a Fresnel or an LED, and big guns like Creamsource Vortex8s and Arri SkyPanel 360-Cs, four to eight of them in a bank to create volumetrics of light. In addition, [key grip] John Westerlaken put in rigging that allowed us to put up flaps, even in front of the lighting, and drop in diffusion like 40-by-40s, 40-by-20s or 40-by-12s underneath the lighting to soften things in a very natural way.


A look at the ceiling rigging on the production’s main ICVFX stage.

“Also, the lighting units were built on scissor rigs so they could be lowered in below the ceiling when we needed them. We’d use the walls for the wides, and when we went in for close-ups, the rigging would drop down behind us so we could light quickly with movie lighting from lower down without bringing it in on the floor. I didn’t steer away from hard light, either — I just made sure to never pollute the screens with it.”


Maximizing Space
The production’s massive LED volume occupied almost two-thirds of its stage’s floor space, leaving little room for production designer Wylie and his crew to change out practical sets.


“The general rule of thumb in OSVP stage construction is that the LED volume should occupy only one-third of the total stage space to allow room for production and sets that will be swapped out,” Goi says. “But Vancouver was a very busy town, and stage space was at a premium. Changing out the practical sets became a delicate scheduling issue with the art department, because there were not enough practical locations to jump to while a swap was happening, and [when] you have to change out the entire floor from a forest setting with trees to a snow setting with igloos, it takes time.”


Consequently, the production built a second, smaller LED volume on an adjacent stage and used it primarily to shoot smaller scenes and most of the Appa flying sequences.


A-camera operator Dean Webber shoots a sequence depicting Aang using his Avatar powers.

Offering Insights
Virtual production has grown in popularity in recent years, but Goi is quick to point out that Avatar: The Last Airbender ultimately relied on a substantial amount of traditional visual-effects work performed after principal photography as well. “Some of the virtual environments were just not ready to be photographed for different reasons, and we shot those scenes on greenscreen,” Goi says. “When you’re creating a show with fantasy locales, it’s not a simple matter of just importing stock backgrounds — everything has to be built from the ground up. So, traditional VFX work still dominates much of the final show.”


On a greenscreen stage, B-camera operator Glen A. Dickson captures a scene featuring Aang in flight.

So where does virtual production fit in the visual toolbox today? “The reality is, cinematographers have been doing virtual production for more than a hundred years,” Goi says. “Every time we’ve had to shoot a window with a fake background outside and make it look real, every time we’ve had to perfectly balance a blue- or greenscreen so things could be composited in there later, we’ve been doing virtual production — and that’s throughout the entire history of motion pictures. The difference today is that the tools are much better, the resolution of the screens is much higher, the color fidelity of what we’re looking at is much more stable, and so on. But the concept is the same.”


Goi adds that his work on this project “was the fulfillment of my initial love and respect for the original animated series. Being able to bring these characters to a new generation of viewers was the opportunity of a lifetime, and I hope the images that I and my fellow cinematographers created will resonate with the audience for years to come.”




Recalling Kurosawa


“Production designer Michael Wylie did a brilliant job on this show — his sets were truly wonderful,” says Goi. “I provided some inspiration for one of them: When we were discussing the look of the village for Episode 2, shot on location [in Maple Ridge, British Columbia], I mentioned that I’d like it to have the feel of the village in Akira Kurosawa’s Seven Samurai. Michael was very excited by the idea and used the inclined topography of the location for maximum depth, bringing the look of the houses in line with that classic film. We built the village to take advantage of the natural direction of the light, and then we added hazy fog and the homage was complete!”


Intent on finding the Avatar, Commander Zhao (Ken Leung) and his troops descend upon a village on Kyoshi Island.



VFX and Post


As the scope of Avatar: The Last Airbender evolved during postproduction, executive producer/episode director Jabbar Raisani was asked to help oversee the expanding visual-effects work. “I have a heavy VFX background, coming from shows like Game of Thrones and Stranger Things, and Netflix was looking for some of that expertise as the plan was reshaped in post,” Raisani explains.


The eight episodes of Avatar’s first season included more than 3,400 VFX shots handled by more than 20 vendors. Postproduction spanned September 2022 to February 2024. Key providers included Scanline VFX, Image Engine, Framestore and Important Looking Pirates.


Katara learns to use her water-bending abilities.

Some of the series involves fully digital actors and environments, but most of the VFX required live-action elements. For the VFX imagery, Raisani’s crews did their best to emulate the original cinematography. “Michael Goi [ASC, ISC] directed and shot the first block [of episodes], so he set the look,” he notes. “For each subsequent block, we always looked at what each cinematographer brought to the set and tried to match their lighting. Even if we wanted something like brightness to pop on a shot, we’d match the plate first and then address it in the DI with our colorists. If we didn’t match the plate material, our VFX work would inherently stand out differently.”


Another challenge was conforming the VFX to the look of the original animated show. “We’d always look at any sequence that had been established in the original animated series and try to be truthful to the framing they intended in the original 2D space. In post, we’d even have the animated series in the temp cut and then try to create VFX representing the same concepts. We’d send that out to the vendors to communicate the feeling we were looking for.”


The crew captures high angles for a scene in which Aang recovers from his hibernation as Katara and “Gran Gran” (Casey Camp-Horinek) observe.

Real-world references were key in creating effects for the elemental battles in which the benders wield fire, water, earth and air as weapons. “For fire, we tapped a reference library built during research on flamethrowers for Stranger Things,” Raisani says. “Water was more challenging because we needed it to act in ways water doesn’t normally; we looked at footage of how it behaves in zero-g to get ideas.”


Most of the simulations were created using SideFX Houdini combined with proprietary software such as Scanline’s Flowline simulation tool; the compositing was largely done in Nuke, and Maya handled the bulk of the 3D work across the different vendors.


A traditional performance-capture volume was built and used for complex martial-arts sequences, but the team also leveraged Move AI, which offers an AI-driven, camera-based video mocap tool.


Actor Elizabeth Yu, as Princess Azula, performs a greenscreen-aided maneuver on an ICVFX stage.

Another capture technique involved volumetric methods. “Scanline’s Eyeline is a performance-capture tool that captures video as three-dimensional geometry,” Raisani says. “We used it for a scene during the Agna Qel’a siege. The system is an array of cameras facing inward, so you get live-action performances as 3D models. You can then drop them into a completely CG scene and move the virtual camera around them.”


Raisani was also tasked with conforming all the VFX into a coherent color-finishing pipeline as final delivery approached. “[Creative post council co-lead] Siggy Ferstl at Company 3 set up the show LUT in prep, so every VFX vendor would apply that to their shots, and Siggy used that as a starting point. We tracked the dailies color all the way through post. Siggy also did a lot of pre-grading and could send frames to the vendors for reference; it wasn’t baked in, but it helped everyone understand where their shots would fit into the final.”
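
To make the “not baked in” idea concrete: in a pipeline like the one Raisani describes, plates and VFX renders stay in their working state and a show LUT is applied only to produce what people view. The NumPy sketch below is a conceptual illustration under that assumption, with a placeholder LUT and nearest-node lookup; it is not the production’s actual pipeline or Company 3’s tooling.

```python
# A minimal sketch of the "viewing LUT, never baked in" idea: the plate stays
# in its working state, and the show LUT only makes the display copy.
# Pure NumPy; the toy LUT and random plate below are placeholders, not the
# production's actual show LUT or Company 3's tools.
import numpy as np

def apply_3d_lut(image: np.ndarray, lut: np.ndarray) -> np.ndarray:
    """Apply an N x N x N x 3 LUT to a float RGB image in [0, 1]
    (nearest-node lookup for brevity; real pipelines interpolate)."""
    n = lut.shape[0]
    idx = np.clip(np.rint(image * (n - 1)).astype(int), 0, n - 1)
    return lut[idx[..., 0], idx[..., 1], idx[..., 2]]

# Placeholder "show LUT": an identity grid pushed through a mild contrast curve.
N = 33
grid = np.linspace(0.0, 1.0, N)
r, g, b = np.meshgrid(grid, grid, grid, indexing="ij")
show_lut = np.stack([r, g, b], axis=-1) ** 1.2

log_plate = np.random.rand(4, 6, 3)               # stand-in for a log plate
display_copy = apply_3d_lut(log_plate, show_lut)  # what people look at
# log_plate is what travels to VFX vendors and the DI; display_copy is only
# a preview, so the final grade can change without re-rendering any shots.
```

Because the viewing transform sits on top of the plate rather than being burned in, the final grade can evolve in the DI without any vendor re-rendering their shots.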


— Noah Kadner




