Meyers adds that there are some lighting instruments that just shouldn't be used with digital equipment at this point. "When we were shooting against bluescreen, we learned a lot about how certain lights designed for film translated to digital," Meyers says. "Some of the materials and some of the lighting instruments weren't optimal. We found that the really short-wavelength, blue fluorescent fixtures that are often used for bluescreen work created a blue that the HD cameras weren't that sensitive to. We therefore made modifications to the screens along the way, including using some different fabrics, and we ended up using a mixture of lighting as well. But the digital camera made it pretty easy to see how even the lighting was on the bluescreen; we were able to optimize the contrast by checking a luminance-only version of the blue channel, which gave us a good control for adapting our lighting. Sometimes it was just easier to get a green extraction unless a particular shot definitely called for blue."
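Meyers's trick of judging screen evenness from a luminance-only view of the blue channel is easy to illustrate. The sketch below is a hypothetical modern analogue in Python/NumPy, not ILM's actual tooling: it treats the blue channel of an RGB frame as a grayscale image and reduces it to a few summary statistics, where a low coefficient of variation indicates even lighting.

```python
import numpy as np

def bluescreen_evenness(frame: np.ndarray) -> dict:
    """Report how evenly a bluescreen is lit, using only the blue channel.

    `frame` is an H x W x 3 RGB image with float values in [0, 1].
    Viewing the blue channel on its own ("luminance-only") makes hot
    spots and dark corners obvious; here we also reduce it to numbers.
    """
    blue = frame[..., 2]  # blue channel, treated as a grayscale image
    return {
        "mean": float(blue.mean()),
        "min": float(blue.min()),
        "max": float(blue.max()),
        # Coefficient of variation: 0.0 would be perfectly even lighting.
        "cv": float(blue.std() / blue.mean()),
    }

# Synthetic test frame: a screen that falls off toward the left edge.
h, w = 480, 640
falloff = np.linspace(0.7, 1.0, w)      # horizontal lighting gradient
frame = np.zeros((h, w, 3))
frame[..., 2] = falloff[np.newaxis, :]  # blue channel carries the gradient

report = bluescreen_evenness(frame)
print(report)
```

In practice a gaffer would look at the blue channel as an image rather than at statistics, but the same principle applies: uneven illumination shows up directly in that single channel, before any keying is attempted.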

Meyers believes that many of these complications will disappear as digital cameras continue to evolve. "We began shooting Episode II with a lot of first-generation equipment, but during the span of postproduction we went through two more generations of cameras," he notes. "The cameras we ended up using for pickups and some of the model work were quieter and had a little bit better ‘signal to noise’ [ratio] than the ones we shot with at first. Also, the lenses we were using by the end of the project were sharper at the edges, and the geometry ended up being flatter."

Until Episode II rolled around, ILM’s postproduction pipeline was configured to scan a film original into the digital realm, marry it with digital visual effects, and then send it back out to film. Episode II’s HD material required a significant retooling of the pipeline, according to Judith Weaver, ILM’s executive producer of visual effects. "Because the film was to be shot in HD, we had to determine how to push that pristine HD footage, which has a totally new look, through our pipeline."

Instead of streamlining the process by disconnecting their film scanners and going right to work on the HD footage, ILM’s post artists had to contend with a number of new issues. "HD is a different medium than scanned, digitized film, so we couldn’t simply disconnect the scanners and go to work," Weaver explains. "It has different visual demands, so we had to treat it accordingly."

The solution was to essentially build a new pipeline, which was tough in light of Episode II’s accelerated schedule. Whereas Episode I was completed in two years, Lucas wanted Episode II to be shot, finished and in theaters in just over a year. "Because ILM didn’t have an HD pipeline, we needed to figure out how to develop and get around in a new system," Weaver says. "For instance, on stage we had to decide whether to shoot directly to a server or to tape. What was the most efficient way to do it with the least loss of quality? We decided to try to shoot directly to the server, but we occasionally had to go to tape simply because of a lack of resources – at one point, we had seven stages shooting."

The use of HD simplified the post process in some ways and complicated it in others, she adds. "It streamlined the process of getting images online because it was online instantaneously. It also allowed us greater freedom in terms of manipulating shots, and we could move things around more quickly." However, ILM artists were accustomed to working with color-corrected footage that only needed to have effects shots added, and on Episode II they found themselves working with the HD equivalent of raw footage. "HD is so unforgiving; you see everything," Weaver notes. "Makeup is a case in point. Queen Amidala’s (Natalie Portman) eyeliner had created a tiny smudge on the white of her eyelid. On film, you never would’ve seen it, but we had to eliminate it. Dealing with details like that, which you just wouldn’t have seen on film, almost doubled our compositing time."

Some Episode II imagery needed no color correction, but when such adjustments were necessary, the post team had to determine at what point in the pipeline they should occur. (Director of photography Tattersall was not involved in any of the post work.) "Decisions such as whether a whole scene needed to be darker or whether the faces needed to come up used to be made in the traditional color timing on the way in," Weaver notes. "We had to decide whether that method was appropriate for HD, and it turned out that it wasn’t. To maintain flexibility, we chose to do our color correction in the composite and in the color timing out the back end."

ILM artists used Snell & Wilcox's Picasso color-correction system to put the finishing touches on Episode II. "The system allowed us to manipulate all the elements, whether CG or live action, to ensure that the images had a consistent richness," Meyers says, "but it’s not fair to say that we used a lot of post technology to take the digital elements where they wouldn't have otherwise gone. We weren't trying to push or exaggerate things, and we used the same system on the film and digital-cinema releases. Our goal was to create the most visually pleasing movie for both film and digital projection.

"The final digital files were interpositives [IPs], but we created all of the printing negatives from the digital intermediate, which was the master," he continues. "Although the digital master is an IP when you look at it that way, it's not a reversal process at that stage, so in that sense we didn't shoot out an IP; we shot out the printing negative from which all the domestic prints were struck. We decided to shoot out seven complete negatives, one for each of the show’s seven reels. When we filmed out, we removed two generations in the process, but even one generation away from the digital master was going downhill. [As a result,] the film version is not quite the same as the digitally projected version."

"We had never filmed out 2000-foot loads before," Weaver adds, "so we had to figure out how to do that, as well as how to drop a shot into the middle of a reel on the filmer if necessary. We started filming out in January [2002] and delivered everything to Deluxe, where they struck the prints. Typically, we have up to our deadline to deliver our effects shots, and then it’s someone else’s job to cut and strike negatives and prints. This time the entire thing was our job."

Meyers dismisses the notion that Episode II looks better digitally projected than it does on film simply because it originated digitally. "I have some concerns about those comments, especially when you consider that so much of Episode I was digital to start with," he says. "Even though it originated mostly on film, Episode I has plenty of digital matte paintings and digital characters. When we did our digital-acquisition tests, we did side-by-side [comparisons] with anamorphic, Super 35, VistaVision and digital and took them all out to film. Shooting digitally, we got a good-looking picture that in many cases was better than many of the film formats. The decision to shoot digitally had nothing to do with digital exhibition, other than that we could be digital from start to finish.

"It's been really exciting throughout this whole process, even from the early development days, to work at a film-based company that has pushed the envelope from early computer graphics to where we are now with digital acquisition and delivery," Meyers concludes. "Looking back at Episode II, we succeeded in all the ways we hoped to succeed, and it was certainly a learning process for all of us. The really exciting part is that what we're seeing now is just the beginning." Weaver agrees, adding that "Episode III will be smoother, and the digital part will be a different animal than what we worked with on Episode II. The next time we shoot with HD we’re going to use a whole different generation of cameras, and that will change the pipeline again."


© 2002 American Society of Cinematographers.