On the evening of May 23, 2018, members of the ASC Motion Imaging Technology Council (MITC) met at the Clubhouse in Hollywood to hear the reports and updates from its committees, as well as see a presentation from Radiant Images, Yi Technology and Google Jump on virtual reality.
MITC chair Curtis Clark, ASC reported that the 2018 ASC Motion Imaging Technology Council Progress Report for the SMPTE Motion Imaging Journal, the 11th such report, is nearly complete, and thanked MITC secretary David Reisner for collating and editing it, and the MITC committees for their “excellent contributions.”
The ACES Next development team from AMPAS, headed by Annie Chang, Joachim “JZ” Zell and Rod Bogart, has been on a “listening tour” to get feedback from ACES 1.0 users and DITs. Chang added that a survey will go out soon to ASC members, and that she’s also getting feedback from colorists. She also noted that creating ACES Next will be an open, community effort rather than one that takes place inside the Academy.
With regard to LED lighting, George Joblove spoke about the Spectral Similarity Index (SSI), developed as a proposed color-rendering index. He and Dan Sherlock did some analysis showing that “skin tones work as well as others for SSI,” which is “intended as a confidence indicator to show predictability with a solid-state lighting source.” “A low score doesn’t necessarily mean you’ll get bad results, but testing is in order,” said Joblove. “We want to start evangelizing with lighting vendors to make sure they fully understand what SSI is and why it’s a good thing.” He added that good contacts at lighting manufacturers are always appreciated.
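SSI works by comparing the spectral power distribution of a test source against that of a reference. As a rough illustration of the idea, the sketch below scores how closely two sampled spectra match; it is a toy metric, not the Academy’s actual SSI computation, which prescribes a specific wavelength range, per-band weighting, smoothing and scaling.

```python
from math import sqrt

def similarity_score(test_spd, ref_spd):
    """Toy spectral-similarity score (0-100; 100 = identical spectra).

    Illustrative stand-in only: the official SSI defines weighting,
    smoothing and scaling steps that this sketch does not reproduce.
    """
    t_sum, r_sum = sum(test_spd), sum(ref_spd)
    test = [v / t_sum for v in test_spd]          # normalize to unit area
    ref = [v / r_sum for v in ref_spd]
    r_mean = sum(ref) / len(ref)
    rel = [(t - r) / (r + r_mean) for t, r in zip(test, ref)]
    rms = sqrt(sum(d * d for d in rel) / len(rel))
    return max(0.0, round(100.0 * (1.0 - rms), 1))

# A broadband source matches a flat reference perfectly, while a
# narrow-band "LED-like" spike scores much lower:
flat = [1.0] * 31              # e.g. 380-680 nm sampled every 10 nm
spiky = [0.05] * 31
spiky[15] = 5.0                # single dominant emission band
print(similarity_score(flat, flat))   # 100.0
print(similarity_score(spiky, flat))
```

The spiky source scores poorly even though it might still render some subjects acceptably, which mirrors Joblove’s caveat: a low score is a flag that testing is in order, not a guarantee of bad results.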
With Red representative Dan Duran in the audience, Bill Bennett, ASC — vice-chair of the Camera Committee — asked him to explain recent changes in the company’s product nomenclature. Duran reported that, to simplify, the company has coalesced around three camera names: the 5K Gemini, 8K Helium and 8K/60-fps Monstro.
Camera Committee chair Dave Stump, ASC — along with Bennett and Richard Edlund, ASC, who is also a vice-chair of the Camera Committee — noted that they’re working on a project (to be presented at Cine Gear Expo) testing the pipeline from on-set to VFX and compositing, to prove the value of lens metadata. “The point we try to make is if VFX spends four or five days reverse engineering what’s happening in the shot, having lens metadata can narrow that down to one day, because the processes are automated,” said Stump.
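As an illustration of the kind of metadata involved, the sketch below packages hypothetical per-frame lens values into a JSON sidecar a VFX tool could ingest. The field names and format are assumptions for illustration only; real systems such as Cooke /i and ZEISS eXtended Data define their own schemas.

```python
import json

# Hypothetical per-frame lens-metadata record; field names are
# illustrative, not any committee's or vendor's actual spec.
def lens_record(frame, focal_mm, focus_m, t_stop):
    return {
        "frame": frame,
        "focal_length_mm": focal_mm,   # zoom position
        "focus_distance_m": focus_m,   # focus puller's setting
        "t_stop": t_stop,              # iris, which drives depth of field
    }

def sidecar_json(clip_name, records):
    """Bundle per-frame records into a JSON sidecar string, giving VFX
    the values directly instead of reverse engineering them from the plate."""
    return json.dumps({"clip": clip_name, "lens_metadata": records}, indent=2)

# One second of 24-fps footage on a hypothetical 35mm prime:
records = [lens_record(i, 35.0, 2.5, 2.8) for i in range(1, 25)]
doc = sidecar_json("A001_C003", records)
print(doc[:80])
```

With per-frame values like these recorded on set, the lens solve that Stump describes becomes a lookup rather than a multi-day reverse-engineering job.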
Lens Committee co-chair Jay Holben reported that the committee now has three subcommittees: filter classification, focus accuracy and lens metadata. Joining forces with Ira Tiffen, the first subcommittee plans to classify lens-diffusion filters by their effects on halation, contrast and resolution. Howard Preston is leading the second subcommittee, which aims to develop a more refined system of lens markings and to hold manufacturers accountable for their accuracy.
The Motion Imaging Workflow committee, with chair Greg Ciaccio and co-chairs Tim Kang and Chris Clark, recently held its first meeting. “Annie’s [ACES] group is doing the heavy lifting,” said Ciaccio. “We’re working on education and trying to make the workflow simpler and more focused. We want to examine how people are really using ACES and generate best practices.” Kang said he is exploring the creation of a working group or subcommittee, perhaps piggybacking on SSI, to figure out saturated-color exposure; anyone interested in the topic should contact him.

DI committee co-chair Josh Pines had no report from that committee, but was challenged by Bob Kisor on whether the term “digital intermediate” still makes sense when the capture medium and the output are both digital. “Yes,” said Pines. “What we produce is still an intermediate, because it doesn’t have final transforms baked in. We want to make sure it is useful for the future, so the name still works.”
Advanced Imaging committee chair Gary Demos, whose vice-chairs are Jim Fancher, Bill Mandel, Joe Kane and David Reisner, checked in via speakerphone. He reported that the committee is writing up a “laundry list” of test parameters to shoot with a Sony Venice camera. He also reported that a 24K x 24K-resolution telescope at Mt. Palomar surveys the entire sky; he and Stump are trying to arrange a joint meeting there, with the observatory’s experts giving a presentation.
The VR capture and stitching presentation was truncated because several of those presenting had to leave to catch a plane, but many attendees had a chance to put on a VR headset prior to the beginning of the evening’s meeting. The set-up showed a Yi Halo 360 VR camera with Google Jump Cloud’s AI-assisted automated stitching workflow.
VR committee chair David Morin delivered the report from the committee on Virtual Reality, which also counts Michael Goi, ASC and Mike Sanders as vice-chairs. Morin said that Goi is working on a short VR piece that he hopes will be shown to the group in the future. He also noted that we’ll soon see VR captures with six degrees of freedom (6DOF), which will move us toward a more dimensional VR, as well as a real-time VFX workflow and a closer connection with Virtual Production, another committee chaired by Morin with co-chair John Scheele. Morin reported that, on Ready Player One, director Steven Spielberg used VR goggles to create shots in previsualization, which is becoming a trend. “Previs used to be an over-the-shoulder process where the director told the Maya artist what to do,” said Morin. “Now the director, with his own controller, can shoot with virtual cameras.”
Morin also pointed out the rising use of game engines in the production of motion pictures, showing real-time rendering from Epic Games. “Here we have the development of real-time ray tracing – cinematic lighting – with Nvidia and ILMxLab with Epic Games, based on Star Wars content,” he said. The technology will be available soon in the Epic Games engine. “It’s bringing in a new era of previs, to design shots and get closer to the final image,” he said.
The International Cinematographers Guild’s Michael Chambliss added that he spent a day on the set of James Cameron’s Avatar 2, which is being photographed by Russell Carpenter, ASC. “They’re working in a blended VR environment,” he said. “They’re taking the mocap to turn them into characters in that space and then choosing their shots. Production has discovered there’s true value in having that camera in the hands of a cinematographer or camera operator.”
Relevant to the Professional Monitors committee, Jim Houston noted that Apple has created a draft standard for a P3 display. The graphics industry doesn’t have monitors that let it develop HDR VFX, added UHDTV committee chair Don Eklund, who said, “They are trying, but it’s not very pretty.” His back-of-the-napkin specs were 100 percent of P3 and a 1,000-nit peak at a sub-$5,000 price, and he said the response from the display industry has been better than he expected.

NGCD committee co-chair Eric Rodli, giving the Next Generation Cinema Display report, brought up the industry’s interest in HDR StEM material. The committee has been a bit dormant, Rodli said, because “we weren’t sure where the change was coming from.” “We want to take advantage of new technologies, but we also want to preserve creative intent,” he said. “That’s our dual charter.”
The UHDTV Committee report featured Chang and Mike Zink’s report on the UHD Alliance. “From a UHDA perspective, we’re trying to prepare premium parameters for consumer TVs,” said Zink. “One thing that’s been tricky is to maintain the creative intent in the home. Many CE manufacturers have finally heard the creative community, so there’s enough interest to fix a lot of these problems.” Chang is also chairing a subgroup to get more reach into the creative community, which has indicated the desire for a reference mode. The subgroup has to discuss such factors as whether it should be easy to get into reference mode (and, if so, what that means) and then bring those requirements back to UHDA. Eklund noted that it was “a group effort” to arrive at 2,000 nits for peak brightness. He suggested people experiment with the Apple TV box and get HDR TVs for their homes to do the same.
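For context, peak-brightness figures like these map to HDR signal levels through the SMPTE ST 2084 (PQ) transfer function used by UHD consumer formats. A minimal sketch of the inverse EOTF, with constants taken from the ST 2084 spec:

```python
# SMPTE ST 2084 (PQ) constants, as defined in the spec.
m1 = 2610 / 16384
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_encode(nits):
    """Map absolute luminance in nits to a normalized 0-1 PQ signal level."""
    y = max(nits, 0.0) / 10000.0   # PQ is defined up to 10,000 nits
    yp = y ** m1
    return ((c1 + c2 * yp) / (1 + c3 * yp)) ** m2

# Display peaks under discussion use only part of the PQ signal range:
for peak in (100, 1000, 2000, 10000):
    print(f"{peak:>5} nits -> PQ signal {pq_encode(peak):.3f}")
```

A 1,000-nit peak lands at roughly three-quarters of the PQ signal range, which is why a grade mastered for one peak level can shift in appearance on a display with a different one, the “creative intent in the home” problem Zink describes.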
Thomas Wall from the SSIMWave Evaluation working group reported that the group has begun creating HDR test material. He acknowledged Samsung, which provided an Onyx LED emissive display, and Roundabout West, which helped with color grading; so far, Stump, Clark, Bennett and Steve Poster, ASC have graded material on the display. Lenovo account executive for media and entertainment Pierre Couture is also helping to put together a test bed. “We discovered interesting things having to do with ACES, so Annie and JZ came aboard,” he added. “We’ve had a fairly extensive look at this new technology, and anyone who has seen it can say it’s stunning and a game changer.”
Pete Ludé from the Computational Cinematography committee, which also includes Reisner and Pines, reported that the group is stymied over its name (“Plenoptic” wasn’t a hit). But, he said, “We may need a Plenoptic Decision List to help put artistic intent in the right place.” He is convening a light-field workshop under the auspices of the Visual Effects Society, SMPTE and the ASC on June 7 in San Francisco that will include breakouts for cameras, displays, post-production and interchange standards. Held at Adobe, the event is invitation only. The committee’s next meeting will take place at the end of summer, and he encouraged all to attend.
Michael Friend, co-chair of the Restoration & Preservation committee, reported that his group, which includes committee chair Grover Crisp, is “capturing master renders for our archive,” and that he is talking with VR and plenoptic people about preservation requirements for their content. “Nobody has a good answer,” he said. “Even how to find the material after production is done is a challenge.”