(By Ann Hornaday, Washington Post, May 18, 2012)
“It’s not a bug, it’s a feature.” That’s an
oft-heard refrain in the computer world, where programmers routinely trot out “improvements”
that users experience as irritations and glitches. The phrase came to mind a few weeks ago
during CinemaCon, a confab of movie exhibitors in Las Vegas where Warner Bros.
showed them 10 minutes of Peter Jackson’s hotly anticipated adaptation of “The
Hobbit: An Unexpected Journey.” As reports filtered out of Caesars Palace, no
one was talking about Elijah Wood’s Frodo or Martin Freeman’s Bilbo or how the
action and meaning of J.R.R. Tolkien’s epic fantasy had translated to the
screen. Rather, the blogs were agog with news about a new 3-D digital format
Jackson used to photograph “The Hobbit,” at a souped-up 48 frames per second —
twice as fast as the usual 24 frames per second of conventional film.
One
exhibitor present reportedly compared the look to a behind-the-scenes
featurette; Variety reporter Josh L. Dickey wrote that the new format lacked
the “cinematic glow of industry-standard 24 fps.” Although computer-generated
characters were a “distinct presence,” he continued, “human actors seemed
overlit and amplified in a way that many compared to modern sports broadcasts
. . . and daytime television.”
But at least
one film-lover in Vegas liked what he saw. The “Hobbit” footage, wrote online
film columnist Jeffrey Wells on his Web site, Hollywood Elsewhere, was “like
watching super high-def video, or without that filtered, painterly,
brushstroke-y, looking-through-a-window feeling that feature films have
delivered since forever.” The high frame rate, he continued, “removed the
artistic scrim or membrane that separates the audience from the performers.” In rhapsodizing about the heightened realism and sharpness of the “Hobbit” footage, Wells put his finger on an aesthetic shift: what looks like a bug to many viewers raised on the grain and texture of film is well on its way to becoming a feature.
Video
initially presented a threat to film studios, before they learned to leverage
it, both in production and as an added revenue stream. In time, it seemed as
though video might even save the cinematic medium itself, financially and
creatively. In the documentary “Hearts of Darkness,” Francis Ford Coppola
famously predicted that one day “some little fat girl in Ohio is going to be
the new Mozart and make a beautiful film” with a video recorder. “The so-called
professionalism of movies will be destroyed forever,” Coppola enthused. “And it
will really become an art form.” Art
form or not, videotape clearly held equal attraction for big-fish studios and
tiny-minnow indies, for the same reason: It’s cheap and easy. Forget the fat
girl in Ohio — it was the fat cats in Hollywood who saw video as one way to
keep spiraling costs down. Once video was digitized — tape supplanted by computer technology — its aesthetic potential increased exponentially.
Digital
image capture started gaining crucial toeholds in the 1990s, when respected
cinematographers and directors began to embrace it. It was no surprise that
such early technological adopters as George Lucas and James Cameron began
evangelizing for digital. But cineastes took more serious note when in 2000 the
revered Roger Deakins pioneered the use of the digital intermediate process —
whereby a film is finished in digital form before going out to theaters — on no
less than the tea-soaked, Depression-era throwback “O Brother, Where Art
Thou?” In 2007, David Fincher proved
digitally recorded images could convey authentic period mood and intensity in
his 1970s thriller “Zodiac”; that same year, “Paranormal Activity” digitized
the video revolution that began eight years earlier with “The Blair Witch
Project,” which ushered in a new era of “found footage” horror films, marked by
blurry, blippy, buggy images that intentionally looked cadged on the fly rather
than composed. In 2009, “Slumdog Millionaire” became the first film
photographed entirely digitally to win a best cinematography Oscar, and
“Avatar” changed the 3-D digital-image game forever.
But 2009 was
also the year that Michael Mann — known as a bold visual stylist from his work
on “Miami Vice” and “Heat” — released “Public Enemies,” about Depression-era
gangster John Dillinger. Filmed on high-definition video, “Public Enemies”
looked cheap and cheesy, especially in its nighttime action scenes, which
possessed the wavy, floaty quality most often associated with daytime
television. Mann’s stylized signature seemed to have given way to an
anachronistic, pixelated sharpness that took on a smeary quaver when Dillinger
and his men were in motion. A year later and several genres away, the
romantic-action-comedy “Date Night” suffered from the same problem, with the
antics of Tina Fey and Steve Carell often looking as though they had been
filmed for a made-for-TV movie rather than a $50 million Hollywood production.
If this was “lifelike” HD, give me painterly brush strokes — also known as
depth, texture, warmth and translucence — any day.
Still, as
disquieting as those misfires were, digital seemed to take a giant leap forward
last year, when such films as “Drive,” “Melancholia,” Fincher’s “Girl With the
Dragon Tattoo” and the Navy SEALs thriller “Act of Valor” found new saturation,
expressiveness, range and precision in digital cinematography. Explaining how
he achieved the bright palette and vivid lines in “Drive,” director Nicolas
Winding Refn first mentioned getting a really good camera (in this case the
Alexa, manufactured by nearly 100-year-old German camera company Arri) and
hiring a great digital colorist, who can shape and deepen a film’s look by
adjusting color just as lighting electricians contour shadow and light for the
same purposes. “The grader is the new gaffer,” Refn quipped, referring to color
and light technicians.
Directors
Scott Waugh and Mike “Mouse” McCoy used a Canon 5D digital camera to film “Act
of Valor,” much of which was caught in real time on actual SEAL training
maneuvers. But there’s some 35 millimeter film in “Act of Valor” as well, and
the two were adamant that the video portions look just as rich and textured as
the film stock. “Shane has a really good word for it, which is plastic,” Waugh
said of cinematographer Shane Hurlbut, describing how video looks at its worst.
McCoy added that they fitted the digital cameras with old Leica lenses, which
added texture to the images. “They had this 1980s Leica glass that just had
this subtlety to it,” he recalled. “All of a sudden we’d found the secret
sauce.” When “Act of Valor” was in post-production, the filmmakers took care to
turn any “noise” — a static-like pattern that’s a common artifact of filming on
video — into film grain. “It’ll be 35 [millimeter] to 5D in the same scene,
and it’s seamless,” McCoy said.
Such
directors as Refn, Fincher, Waugh and McCoy are proving that, like all
expressive tools, digital is as good as the artists who use it. But even the
advances of 2011 didn’t convince film’s high-profile holdouts. Shortly after
showing his new film “Moonrise Kingdom” to the press at the Cannes Film
Festival on Wednesday, Wes Anderson — who shot the coming-of-age love story in
Technicolor, on 16 millimeter film — noted ruefully that this might be his last
celluloid endeavor. Anderson recalled
being interviewed by a magazine that regularly publishes grids of what
equipment various directors used on their upcoming projects, including the film
format. “Every single one of them said HD except ours,” he said. “But I think
in two years ours will have to say HD, too. I think this option is
disappearing. . . . I don’t know. Maybe there’s a great app that can make
[digital] look like film, but in my opinion there’s really no substitute.” With
Kodak — one of the chief purveyors of film stock — in bankruptcy and processors
like Technicolor on the ropes, it’s unclear whether raw film stock will even be
an option for much longer.
In December,
Christopher Nolan — he of “Batman Begins,” “The Dark Knight” and “Inception”
fame — convened some of Hollywood’s most high-profile directors to watch
advance footage of “The Dark Knight Rises.”
He then told them why he’d really called. “The message I wanted to put
out there was that no one is taking anyone’s digital cameras away,” Nolan told
DGA Quarterly, which covers the Directors Guild of America, in its spring
issue. “But if we want film to continue as an option, and someone is working on
a big studio movie with the resources and the power to insist [on] film, they
should say so. I felt if I didn’t say anything, and then we started to lose
that option, it would be a shame. When I look at a digitally acquired and
projected image, it looks inferior against an original negative anamorphic
print or an Imax one.” (Nolan is just one of many filmmakers who will be seen
debating the digital-vs.-film issue in “Side by Side,” a documentary produced
by Keanu Reeves that comes out this summer. But it should be noted that he
wasn’t above including some snippets of HD video in “Inception.”)
Nolan,
Anderson and many of their peers have recently spoken out about feeling
pressured by studios to make their films digitally, which is part of a recent
push to have every movie theater in the country — not just chains but
independents — convert to digital projection. The expensive enterprise is by
now almost complete; the average American filmgoer would be hard-pressed to see
a movie projected on film today, even if it was originally made on celluloid.
(Some predictions have 35 millimeter film disappearing almost entirely from theaters by
2015.) On one hand, digital projection
is good news, banishing forever the problem of prints that would tear and
scratch just a few weeks into their runs, not to mention the scourge known as
“under-lamping,” wherein theaters would save money by using low-watt light
bulbs in their projectors, resulting in otherwise sparkling movies being
reduced to vats of pea soup. (I was once with Oliver Stone before a screening
of one of his films in Austin, and he fled the theater just as the lights went
down. He knew what kind of visually compromised experience was coming, he
explained, and he couldn’t bear to watch.)
But it turns
out that digital projection doesn’t guarantee a pristine viewing experience —
far from it. At the Maryland Film Festival this month, University of Wisconsin
Cinematheque programming director Jim Healy recalled seeing the visually
stunning “We Need To Talk About Kevin” on film; when he went back a second
time, the theater was showing it on a high-definition Blu-ray disc (an
increasingly common practice), but through a non-HD system. The result was
dark, dingy and virtually unintelligible, and Healy left after a few minutes: “It just wasn’t the same film.” And don’t get him started on how many times
he’s seen a 2-D movie projected on a screen meant for 3-D films, an
all-too-common occurrence that results in yet more dark, dingy images.
The
digitization of the theatrical experience — whereby audiences are basically
watching a DVD on a really big screen — raises more troubling questions for
Healy and his fellow exhibitors. “We’re now looking at generations of movie
viewers who, if they’re going to cinemas at all, will see stuff digitally
projected, and it won’t be too different from what they’re seeing at home,” he
says. “That concerns me, because if there’s less and less of a difference, then
what’s the reason to keep going out?” And
it’s not just a question of visiting the corner bijou to find fellowship at the
altar of celluloid and sprocket holes. As the distinction between film and
digitally captured images disappears, so does the notion of the cinematic
medium itself. When big-screen movies are made to be seen on iPhones, and the
next big holiday-season epic looks more like an NFL playoff game than a
composition of discrete and expressive formal properties, our aesthetic
expectations aren’t just evolving but eroding.
As the cinematographer John Bailey said to me last year, what’s at stake
in these questions isn’t just the evolution of standards or generational
tastes, but the very notion of cultural consensus over what the term “film”
means. It’s the question of whether a bug should become a feature, and what
might be irrevocably lost in the metamorphosis.