Curtis Clark, ASC
A Controversial Beginning
As anyone involved with feature film and/or TV production knows,
cinematography has recently seen an acceleration in the
routine use of digital motion picture cameras as viable alternatives
to shooting on film. This transitional process began with George
Lucas in April 2000.
When Lucas received the first 24p Sony F900 HD camera to shoot
Star Wars: Episode II - Attack of the Clones, cinematography encountered the beginning of perhaps the most disruptive motion imaging technology in the history of motion picture production.
The initial marketing hype that accompanied 24p 8-bit 4:2:2 HD video, claiming "film is dead," effectively and prematurely undermined
any potential that digital cinematography might have had back then
for establishing an early beachhead toward industry acceptance.
What was missing at that time was a clear understanding of the
requirements for a "digital motion picture camera" that could
go beyond the imaging constraints of broadcast TV-based
HD toward the image capture capabilities of motion picture film.
The majority of filmmakers recognized that the claim of imaging
parity with 35mm film capture was preposterous. The primary limitation was not the 1920 x 1080 spatial resolution, but rather
the limited dynamic range and color bit depth within the restrictive
Rec. 709 gamut, along with the camera's 2/3-in. sensor size, which
significantly altered depth of field and necessitated using lenses designed for the world of HD video. Establishing a convincing "film
look" was generally elusive, and the prevailing non-DI (Digital Intermediate) post workflows of that time presented further challenges with film-outs for theatrical distribution. It is also worth remembering that, at that time, Digital Cinema was not yet on the horizon.