Bill Bennett, ASC captured the short film Flamenco using high-frame-rate techniques to showcase the new post tool’s possibilities.
RealD, known primarily for the stereoscopic polarizer systems used by theater projectors, as well as the ubiquitous Men-In-Black-style glasses found in 3D multiplexes, recently asked Bill Bennett, ASC, to showcase the possibilities of the company’s fairly new approach to anti-aliasing and frame-rate conversion. Bennett’s short film Flamenco was directed by Demetri Portelli, stereo supervisor on the 3D translation of Martin Scorsese’s Hugo, Jean-Pierre Jeunet’s The Young and Prodigious T.S. Spivet, and Ang Lee’s Billy Lynn’s Long Halftime Walk. The short was captured using dual Arri Alexa XT cameras in a Cameron Pace 3D camera rig.
Bennett employed Arri Master Primes as well as Arri Master Anamorphic lenses, which RealD says was a first for 3D capture. An earlier version of RealD’s TrueMotion processing system was employed by cinematographer John Toll, ASC, on Billy Lynn’s, released in 2016. RealD screened footage from a combat sequence seen in the picture, which was wholly immersive and impressively sharp as a 3D experience.
“Since the beginning of cinema, people have been trying to figure out how to keep the wagon wheels from appearing to go backwards,” Bennett explained before introducing Tony Davis, architect of TrueMotion and Senior Scientist at RealD. “He’s figured out how to do that. He has figured out how to eliminate strobing. The technique is simple for a cinematographer, but then the process that he created has complex mathematics.
“You shoot 120 frames per second, 360 shutter angle, and then you run the files through his tool, and you get a 24-frame-per-second result,” continued Bennett. “You get to pick the shutter angle during postproduction. You can have any shutter angle you want. Even shutter angles greater than 360, if you so desire. You also get to have different attack and decay.
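The shutter-angle selection Bennett describes can be illustrated with a toy model. This is an assumption-laden sketch, not RealD's published algorithm: it simply treats each 24fps output frame as a window of five 120fps capture frames (120 / 24 = 5) and converts a chosen synthetic shutter angle into blend weights for those frames, with a fractional weight on the frame where the synthetic shutter "closes."

```python
# Toy model: one 24fps output frame spans five 120fps capture frames.
# A post-selected shutter angle determines how many of those frames
# (including a fractional tail) contribute to the blend.

def shutter_weights(angle_deg, frames_per_output=5):
    """Return per-frame blend weights for a synthetic shutter.

    angle_deg: desired shutter angle (360 = all five frames fully open).
    frames_per_output: capture frames per output frame (120 / 24 = 5).
    """
    open_frames = (angle_deg / 360.0) * frames_per_output
    weights = []
    for i in range(frames_per_output):
        # Each capture frame is fully, partially, or not at all
        # inside the synthetic open interval.
        w = min(max(open_frames - i, 0.0), 1.0)
        weights.append(w)
    total = sum(weights)
    return [w / total for w in weights]  # normalize to preserve exposure

print(shutter_weights(360))  # equal blend of all five frames
print(shutter_weights(180))  # roughly the first two and a half frames
```

Shaping this weight ramp rather than cutting it off sharply is, in effect, the "attack and decay" control Bennett mentions; the hard cutoff used here is the simplest possible case.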
“The advantage is that anything that is moving fast has a soft edge to its blur, and things that are stationary in the frame are not affected. This is pretty significant as we're moving into high dynamic range, because as contrast goes up, you're seeing exaggerated strobing in objects moving across the screen.”
With cranking and ramping options in the RealD TrueMotion software, frame rates do not have to be homogeneous and are infinitely adjustable after capture without any detriment to footage metadata or exposure fidelity. They can also be applied selectively to areas in a frame, so foreground elements, such as individual actors, can be masked. RealD showed corrected footage of the classic reverse-wagon-wheel effect mentioned by Bennett as an example.
The “synthetic shutter” created by RealD’s TrueMotion software allows adjustments to sharpness, motion blur, and judder after capture. TrueMotion averages five equally weighted exposures of 120fps footage for each single frame of 24fps output, and it can also be set to weight certain frames more heavily. The result can be traditional, cinema-like footage at 24fps, or sharper, more staccato looks without the judder that would normally accompany them. Running on Mac, the software works with ProRes, H.264, ARRIRAW, ACES (EXR), DPX and TIFF workflows. Further options are planned.
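The five-to-one averaging described above can be sketched in a few lines. The array shapes and random "footage" here are hypothetical stand-ins; the actual tool operates on camera files with far more sophisticated weighting:

```python
import numpy as np

# Hypothetical clip: ten frames of 120fps footage as tiny 4x4 grayscale images.
rng = np.random.default_rng(0)
clip_120 = rng.random((10, 4, 4))

# Group every five consecutive 120fps frames and average them equally,
# yielding 24fps output (10 / 5 = 2 output frames here).
frames_out = clip_120.reshape(-1, 5, 4, 4).mean(axis=1)

# Weighted variant: emphasize the middle exposure (the "weight certain
# frames" option) while keeping total exposure constant (weights sum to 1).
w = np.array([0.1, 0.2, 0.4, 0.2, 0.1]).reshape(1, 5, 1, 1)
frames_weighted = (clip_120.reshape(-1, 5, 4, 4) * w).sum(axis=1)

print(frames_out.shape)       # (2, 4, 4)
print(frames_weighted.shape)  # (2, 4, 4)
```

The equal-weight case reproduces a full 360-degree synthetic shutter; biasing the weights toward the center exposure is one simple way to sharpen the result without dropping frames outright.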
For the high-frame-rate needs of Flamenco, and to test the RealD TrueMotion footage processing system, Bennett captured at 96, 120, and 192fps with the dual Alexa XT cameras. He had an additional Alexa Mini to acquire in 2D at 192fps. RealD then used TrueMotion processing to compare and contrast the variety of frame rates as they were converted to smooth, seamless projection at 24fps.
The company showed the footage on its Ultimate Screen, which it says offers 85% more brightness than standard silver screens thanks to a new reflective surface. When the company presenting a film also manufactures the screen it’s shown on, you know you’re in for a very technical presentation. RealD Senior Scientist Tony Davis did not disappoint.
“For 100 years, we’ve enjoyed movies, and it’s been great,” explained Davis about the need for a new system of shutter control. “But over those 100 years, we’ve been adding on to our house. We’ve been adding high frame rates. We’ve been adding 4K. We’ve been adding stereo imaging, which we’re very happy about. We’ve been adding high dynamic range. And that’s putting increasing stress on our foundation, and we’re now starting to see judder.”
“There's nothing wrong about a 180-degree shutter,” Davis continued. “It works beautifully, and it has worked beautifully for 100 years. What we have not had were new controls that could make it look any different. We always had judder in some proportion, and if we wanted to have more or less judder in certain ways, we didn't have controls for that. We basically had shutter open or shutter closed. That's what we're now doing: adjusting what happens in the shutter phase, the attack and decay of the shutter waveform.”
“We acquire footage at a much higher frame rate than we are interested in ultimately delivering, very high frame rates, and we do that with a 360-degree shutter, so at all times we are capturing everything within that frame-time… We run it through TrueMotion, and we drop it down to a lower frame rate. We use all of that frame data, and we can mathematically put it together to synthesize these shutter waveforms that the camera could not do. If we need multiple output frame rates, we can just do it again!”
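Davis’s point about re-deriving multiple output rates from the same capture comes down to simple divisibility. A back-of-the-envelope sketch, with nothing here taken from RealD’s implementation:

```python
CAPTURE_FPS = 120  # 360-degree shutter at capture, per Davis

def frames_per_output(output_fps, capture_fps=CAPTURE_FPS):
    """How many capture frames fall inside one output frame."""
    if capture_fps % output_fps:
        raise ValueError(
            f"{capture_fps}fps does not divide evenly into {output_fps}fps output"
        )
    return capture_fps // output_fps

# The same 120fps master can be re-rendered to several delivery rates:
for fps in (24, 30, 60):
    print(fps, frames_per_output(fps))
# 24fps -> 5 capture frames per output frame
# 30fps -> 4
# 60fps -> 2
```

Each delivery rate simply gets its own window of capture frames, over which a synthetic shutter waveform can be blended, which is why re-running the process for a second output rate costs nothing at the capture stage.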
“I used to work on medical CAT scanners!” laughed Davis when asked about the genesis of TrueMotion. “A lot of the mathematics that apply to cinema cameras apply to CAT scanners. We actually got into this whole thing because I started using cinema examples to explain the mathematics of CAT scanners to my colleagues. I would use Ray Harryhausen [stop-motion animation] footage, of the skeletons fighting [in Jason and the Argonauts], to show what happens with a zero-degree shutter, essentially. Because that's what that is! Stop motion is a zero-degree shutter. It has no motion blur at all. Then I showed [ILM] ‘go motion’ footage from The Empire Strikes Back, where they would actually wiggle the puppets a little bit during the shot to give it motion blur. I kept saying that motion blur is really important.”
Davis explains that judder is a camera problem, while strobing is a projection problem. With the camera on one end of production and the projection on the other, the goal of the TrueMotion system is to take the possibilities of continuous motion capture, break them into frames, properly sample the image, and then reconstruct it without loss of fidelity. Thanks to digital control over the exposure ramping between frames, RealD says the process not only eliminates judder and strobing, but also allows these effects to be controlled, localized, and even added or enhanced as a creative choice.