Film's not dead, damn it!

Interviews with some of today's leading cinematographers -- the real magic-makers of the movies -- suggest that George Lucas' overhyped "digital revolution" is mostly marketing buzz.

Published July 3, 2003 8:00PM (EDT)

The way we see movies is about to change -- tomorrow. Or maybe the day after that. Then again, it might not happen until sometime next week. But according to what we've been told -- by the media, by some filmmakers and, perhaps most significantly, by the people who actually manufacture the necessary equipment -- we do know for sure that digital technology is poised to revolutionize the moviegoing experience.

But not many people have asked the essential question: How are these movies going to look?

The best people to ask are the ones who have the most at stake, the people who have built careers and reputations on knowing what it takes to make a movie look just so. Cinematographers are at the vanguard of the changing technology; many of them have quickly, and happily, familiarized themselves with digital editing processes, particularly for movies that feature lots of special effects.

But last summer a controversial, and not exactly astutely researched, Los Angeles Times article depicted contemporary cinematographers as a bunch of aged Luddites quaking in their boots as they face an onslaught of bright youngsters brandishing fancy new digital cameras. At the center of the article was "Star Wars" emperor George Lucas, who had invited a group of big-name directors -- among them Francis Ford Coppola, Robert Zemeckis, Oliver Stone and Steven Spielberg -- to his private screening room to sell them on the wonders of digital technology.

The article was essentially just another version of the "Film is dead" rallying cry -- a triumphant shout that's been around for so long now that it's more like a feeble cough. Film isn't dead, although it is of course changing, and changing fast. But the Los Angeles Times article, and others like it, suggested that today's cinematographers are nervous about those changes, when in fact, they'd be the first to acknowledge that staying on top of them is part of their job. How many photographers, of any stripe, do you know who don't jump at the chance to fool around with new equipment? The cinematographer's artistry depends on knowing what tools to use -- digital or otherwise -- and when to use them.

Steven Poster, a former president of the American Society of Cinematographers, calls it a kind of alchemy. "It's what we do, the magic of deciding, 'I'm going to use this kind of film stock for this, or this kind of digital camera, or this kind of technology or technique. I'm gonna use these lights, I'm gonna make it look like this.' We have to stay abreast of these developments at all times. There's constant learning within the field, of knowing what our tools are capable of."

The problem isn't that cinematographers don't like digital technology; it's simply that they know what its current limitations are. There has been plenty of hype surrounding digital technology as it's been used in filmmaking. And there are certainly pictures, like the first two entries in the "Lord of the Rings" trilogy, that wouldn't look half as beautiful as they do without the use of that technology.

But remarkably few people have bothered to ask cinematographers -- the people who should know best -- what the technology's current strengths and limitations are. People like George Lucas like to think they're on the vanguard of these new methods and modes of filmmaking. But it probably hasn't occurred to most moviegoers that the "Film is dead" movement may be more strongly driven by forces in the marketplace than by artistic considerations.

In other words, there are some big corporations that would like us to think that digital filmmaking is ready for prime time. And cinematographers may be the last line of defense between those massive marketing forces and the rich visual heritage of movies. Even though most moviegoers think they know what cinematographers do, it's likely they don't even know the half of it. But in the rapidly changing world of filmmaking, their role as preservationists of the quality and vitality of images is more important than ever.

When Terry Gross interviewed cinematographer Gordon Willis for her National Public Radio program "Fresh Air" last fall, she introduced him by playing a few clips from movies he'd worked on: Marlon Brando's opening scene from "The Godfather"; an exchange between Woody Allen and Diane Keaton in "Annie Hall" that captures the sparks flying between their his-and-hers non sequiturs.

You'd think that audio clips would be the worst window into the work of a cinematographer. The surprise lies in how vividly we can see those scenes just by hearing them. You can't hear Brando's voice in that clip without picturing the hollows of his eyes, both as empty and as full as undersea caves. And the mere sound of Keaton's and Allen's tentative chatter resurrects visions of a muted '70s New York skyline cloaked in smog and romance -- we can't distinguish one from the other, and that's precisely the point: their commingled beauty is that intense.

You can't take the measure of a cinematographer's work just by listening to it. But if a sound clip can serve as a miniature testimony to the resonance of a cinematographer's images, it can also highlight how easily moviegoers take the cinematographer's job for granted. In one sense, cinematography should be invisible, since it exists mainly to serve the story that's being told. But movies don't shoot themselves. Unless a movie features lots of pretty natural scenery, moviegoers -- and sometimes even people who are themselves involved in the making of movies -- don't always recognize how much thought and care goes into making the kinds of images you can actually hear on the radio.

"Even within the enlightened community of fellow filmmakers who are not cinematographers," says cinematographer John Bailey, "there has been this confusion or misperception of cinematography as pretty pictures." Bailey's 30-year career has included numerous collaborations with the director Paul Schrader, among them "Cat People," "Mishima" and 1999's lovely but little-seen "Forever Mine." Bailey says viewers tend to be most easily impressed by the prettified Ivory-Merchant aesthetic: "I'm not picking on Ivory-Merchant, but they're kind of a shorthand example -- in other words, period films, beautiful costumes, lush landscapes and impressive exterior photography."

But, Bailey says, that doesn't necessarily have anything to do with cinematography, other than the fact that you put a camera there and capture the whole thing on film. "I think a cinematographer's foremost requirement is to use all of the visual aesthetic skill that he or she has to find a style," he says, "a combination of aesthetic and technique to enhance, enlighten and expand the dramatic, emotional and narrative momentum of the screenplay. In the same way that the screenwriter uses words to tell the story, and the director uses the performances of the actors to reveal the subtext and the nuance of it, the cinematographer uses all of the tools that he or she has, focused through the lens of the camera, to reveal and enhance and expand the story."

In other words, it's a job that requires an unusual amalgam of technical prowess and visual artistry -- in addition to management and diplomacy skills, not to mention knowledge of equipment, lighting, film stock and postproduction processes. That may or may not explain why directors of photography don't surface much in media coverage of new pictures. Actors, directors and screenwriters are interviewed all the time, but no one ever thinks to talk to cinematographers, maybe because they're perceived as eggheads who just want to talk about lenses and film speeds.

When you sit down and actually talk to one, you realize that cinematographers mostly just want to talk about movies -- not just about the techniques used in making them, but also about the ways their visual textures and moods can affect us so deeply and so mysteriously. You can get a sense of that simply by watching Arnold Glassman, Todd McCarthy and Stuart Samuels' superb 1992 documentary "Visions of Light," a beautifully detailed thumbnail history of cinematography. "Visions of Light" isn't noteworthy so much because it explains precisely what cinematographers do (you'd probably need 10 two-hour documentaries for that) but because it captures so perfectly what the legacy of their profession means to them.

The cinematographers featured include just about everyone's favorites: Conrad Hall ("In Cold Blood"), Vilmos Zsigmond ("McCabe & Mrs. Miller"), László Kovács ("Shampoo"), Haskell Wexler ("In the Heat of the Night"), Vittorio Storaro ("The Conformist") and Néstor Almendros ("Days of Heaven"). It's telling that many of them seem more interested in talking about their colleagues' work than their own, and especially about the work of the great directors of photography who came before them -- people like Gregg Toland ("Citizen Kane"), James Wong Howe ("Sweet Smell of Success") and Russell Metty ("Touch of Evil").

For people who spend so much of their time considering how things are going to look on film, cinematographers seem surprisingly good at talking. "Most of the directors of photography I know and associate with are more like Renaissance men than people from any other parts of the business that I know," says Steven Poster, whose own credits include two very distinctive-looking recent releases, "Donnie Darko" and "Stuart Little 2." "We need to be able to take a space or a room or a large expanse and create the type of lighting that will indicate a mood and allow the actors to move within that space and be lit at any given spot that will tell the story."

At the same time, cinematographers are also busy being managers. "We're managing relationships, we're managing budgets, we're managing equipment," Poster says. "And we're managing egos of many other people. It's a multifaceted job. What we really are is Tom Sawyers getting people to whitewash our fence, so we can be off doing the art that we love to do. That's our little secret."

We've all been trained to be skeptical of anyone who's involved in the making of Hollywood films who actually uses the A-word. Given the state of the movie industry today, it's easier for most moviegoers to be cynical about Hollywood than charitable toward it. We've convinced ourselves -- and unfortunately, too often the movies themselves have proven us right -- that movies are made by committees whose sole aim is to make money, instead of by people with eyes and ears, brains and hearts.

Cinematographers seem to be the antidote to Hollywood cynicism -- not because they don't have to finesse their share of studio pressure (there are certainly times when they do), or because they claim that every movie they work on is going to be a lasting contribution to the canon (they don't), but because they believe so wholeheartedly in doing the best work they can on each given project.

After a screening of the recent stinker "The Recruit" (shot by Stuart Dryburgh), I remarked to a colleague how good it looked. "It's amazing how much TLC goes into crap," he said, and he's right. Forget plots that don't work or performances that fall flat: Someone still had to figure out the best way to capture the je ne sais quoi of a particular car chase, or to light, say, a bridge in a way that captures the essence of bridge-ness.

Beyond that, flexibility has to be part of the cinematographer's art. Every collaboration between a director and a cinematographer is different; what's more, cinematographers may do two or three pictures a year, working in different styles with different directors. Poster likens the process of making a movie to entering a marriage, complete with a courtship, a honeymoon period, the actual work of the marriage and an eventual breakup. Even if you work with the same director on another picture, the new marriage will have different characteristics.

In writing about movies, most critics consider the director to be the guy or the gal holding the bag -- not necessarily because they believe the director is the sole and exclusive author of the work but because it's a kind of shorthand. If you're trying to describe what action was taken to make a movie move or feel or read as it does, you need a noun to go with your verb, and in most cases, the director is your noun. As a colleague of mine once explained it, the director's vision is the one through which all other visions are filtered, which is as good an explanation as I've ever come across.

I suspect that directors get most of the credit for the success of a picture (or lack thereof) precisely because movies are such a collaborative process. Sometimes it's easy to separate the strands of who deserves blame or praise; but in many cases a great moment on film that we automatically give a director credit for may very well be the result of some sort of communication, spoken or un-, between a director, his or her D.P., a production designer, a costume or makeup person and any of the actors involved -- all of whom are working off a script by one or more screenwriters, who may also be on hand. And don't forget about producers, people whose degree of hands-on involvement in a picture can vary from not much to a whole lot.

It always feels corny to talk about the magic of movies, but cinematographers don't seem at all uncomfortable with the word. "It seems like magic to us, too, to actually do it," Bailey says. He compares looking at a finished film with what a composer like Gustav Mahler might have felt when he finally heard one of his complex, conflicted and infinitely layered symphonies played by actual musicians. "How awesome it must have been! In filmmaking, there are so many dozens, if not hundreds, of people involved in making a film who, in the right environment, where the producers and the directors give them the opportunity to really express themselves and put themselves into it, can create this incredible thing. When you go and see the finished film, it has an existence of its own, somehow beyond you. It's so much different than, say, writing a book or a play."

Meanwhile, film -- the medium in which cinematographers have been working for some 100 years, a medium that in its relatively short history has given most of us more joy and pleasure than we can possibly measure -- is dead. Or at least, people like George Lucas would have us think so.

Every now and then, a major news outlet will run a feature sounding a tinny but trumped-up death knell for film as we know it. Last summer, Los Angeles Times staff writers P.J. Huffstutter and Jon Healey jumped on the bandwagon, detailing the way Lucas had tried to convince his colleagues of the supremacy of digital technology by holding that powwow in his private screening room. He showed them two identical clips from "Monsters, Inc.," one completely electronic (in other words, stored on digital tape and run through a digital projector), the other on a reel of film that had already done four weeks in a local multiplex.

The electronic clip, the story noted, "looked less like a motion picture and more like an open window onto a real world." Compare that with the jiggly, scratched-up image that limped onto the screen via the poor, pathetic stepcousin known as film.

Lucas had gathered his colleagues, ostensibly, to issue a warning: It was time to leave film behind, or get left in the dust. Lucas, after all, had broken some ground of his own with "Star Wars: Episode II -- Attack of the Clones," which was shot entirely with high-definition digital cameras -- that is, a new breed of cameras that record images on videotape instead of 35-millimeter film, but with crisper detail and a wider range of color than video cameras have traditionally been able to capture. "Attack of the Clones" was also edited with digital equipment and, in the relatively few theaters equipped to do so, projected digitally.

So because he'd been able to make a stiff, crummy-looking, overblown faux-epic on a new plaything, Lucas felt completely justified in foretelling the death of film. The L.A. Times article played right into his phony argument, in language that sounds borrowed from that most filmic of news sources, a World War II newsreel: "Lucas' blunt message stands at the center of a schism in Hollywood over the fate of film in the film business. New high-definition video cameras and digital editing equipment challenge the longtime supremacy of film. They are cheaper and more flexible. But they also frighten directors and cinematographers who understand every nuance of film. A creative misstep can tarnish a career, so many of those established in the film industry blanch at the thought of showing their inexperience with the latest technology. A colossal mistake, seen by millions of fans, might reveal that they are passé storytellers -- easily replaced with younger, cheaper and more tech-savvy rivals."

Aside from a quote or two from cinematographers Roger Deakins ("With digital, it's all very businesslike. We're not businessmen. We're artists and magicians") and Emmanuel Lubezki, who shot portions of Michael Mann's "Ali" using a high-definition camera ("This is different from film. Not better or worse but different"), cinematographers were woefully underrepresented in the piece. Considering they're the people who'd understand better than anyone the potential advantages, or lack thereof, of digital video, it undoubtedly seemed more convenient not to bother asking them.

One of the chief problems with the Times article -- and with Lucas' argument in general -- is that it makes no distinction between the various uses of digital technology. As Poster explains it, "One of the things journalists and the public are confused about is that when you use the term 'digital cinema,' you lump it into one kind of thing, but it's really three things: It's image acquisition, it's postproduction and it's exhibition."

Digital applications are currently most widely used in postproduction, the steps taken at the end of the moviemaking process before the definitive print -- called the answer print -- is struck. That's the stage at which cinematographers color-correct the film, which generally means sitting down with a lab technician and making sure every frame looks the way it's supposed to. "Stuart Little 2," for example, included lots of special effects that had to be added during postproduction. The entire film -- even those portions of it that didn't feature special effects -- was digitized, and Poster used some new digital tools to complete the color correction before transferring the whole thing back to film.

"We edit digitally, we do visual effects digitally, and now we're starting to finish the film digitally," Poster says. "The tools are tremendous, and it's just developing into something that's going to become ubiquitous within the next year or two. Finishing a film digitally will be the norm, not the exception." That's a case of the technology being used to make the process more efficient, but it also, of course, works in the service of maintaining the visual integrity of a film.

Poster is less enthusiastic about digital technology as it has so far been applied in terms of exhibition -- that is, projection. As moviegoers, we've all seen our share of dingy prints at the multiplex: By the time a picture has been shown five or six times a day over a period of several weeks, any print is going to show some wear and tear. No D.P. likes to see that happen to his or her movie. But there are still too many variables involved in digital projection to make it an immediately viable solution, Poster says, no matter how "clean" Lucas' digitized "Monsters" may have looked.

For one thing, there's no worldwide standard for digital exhibition of movies. "Film is a worldwide standard," Poster explains. "You can send a 35-millimeter film to Bangladesh and get it shown." In the digital realm, by contrast, several standards are competing, with no single one in a dominant position.

Also, the cost of equipping a theater to project movies digitally is still prohibitive. Poster puts it at around $150,000 per screen, and because the technology is changing so rapidly, the equipment could become obsolete in as few as five years. (Whereas a regular old motion picture projector costs around $30,000 and might last 20 years.) And how will people be trained to maintain digital-projection equipment and play digital movies so they look as good as they should, when most of the big movie chains have done away with most of their union projectionists? "It's a much more complex technology than we're ready to deploy," Poster says.
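To put those numbers in perspective, here's some back-of-the-envelope arithmetic -- ours, not Poster's, and it assumes straight-line amortization with no maintenance or training costs folded in:

\[
\text{digital: } \frac{\$150{,}000}{5\ \text{years}} = \$30{,}000\ \text{per screen per year}
\qquad
\text{film: } \frac{\$30{,}000}{20\ \text{years}} = \$1{,}500\ \text{per screen per year}
\]

By that crude measure, the digital booth runs 20 times the annual cost of the film booth, before a single projectionist is paid.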

On a more fundamental level, Poster also says we don't really know how images shown on film, as opposed to those captured or projected digitally, affect audiences on a subconscious level. He wonders if maybe there isn't "a perceptual quality to motion pictures that exists maybe because of the flaws of motion picture film." The very slight "jiggle and weave" of film, as opposed to the much-touted steadiness of digital images, may have something to do with why we respond to movies as we do. "There's the granularity of film, which changes on every frame. There are all these perceptual components, which create a hot medium for the audience. It engages the audience in a way."

The point isn't that Poster and his colleagues are resistant to digital technology. They want to make sure any new technology they adopt is better than what they've already got, in subtle ways as well as obvious ones.

Today's cinematographers see what's coming in terms of technology and equipment. When it's good enough for them to use, they say they'll be ready. We're still in the infancy of "high-definition technology" in the movie business, Poster explains. "But high-definition technology, which has been said to be the death of film, the be-all and end-all, is in a rudimentary form that is rapidly changing. So this technology that was supposed to replace film is, within the next year, going to be the old technology."

It's crucial to note, though, that the term "digital technology" doesn't mean much all by itself. Cinematographer Wally Pfister's credits include "Memento" and "Laurel Canyon," but because he began his career as a news cameraman in the early '80s, he knew how to shoot on videotape long before he ever shot a frame of 35-millimeter film. As much as he loves working with film, he's convinced that within 15 to 20 years, electronic media will replace it. But for now, he says the quality of digital images is nowhere near that of images recorded on film, in terms of resolution, richness or subtlety.

Pfister believes that large electronics corporations are using the term "digital" to sell the idea of something revolutionary and hot, even though this "new" technology, at least at this point, is no improvement on the old one. Sony and Panasonic both manufacture high-definition cameras, and have a stake in getting their products used and accepted (not to mention plugged by Lucas), whether they produce satisfactory results or not. It's easy to see how a multinational entertainment conglomerate like Sony -- which owns Columbia Pictures -- would benefit if its equipment became the standard among filmmakers.

"The buzzword is 'digital,'" Pfister says. "It's the same buzzword that's used in the consumer world -- the same word that was used to sell CDs and DVDs and anything for home computers. But it's not an accurate way of describing it. The images are collected and processed digitally, but really, it's videotape. It's a video camera, and the images are recorded onto a video chip." Yet the companies that make this gear, he notes, are trying to act like they've invented "a whole new device."

Even so, Pfister is enthusiastic about digital technology's potential to democratize the world of filmmaking. "Anyone who wants to tell a story can afford to," he says, "by picking up a camera for $1,000 and buying computer software for another $1,000. That basically allows them to write, produce and edit films entirely on their own." Pfister, Poster and Bailey all note that there are instances where the new high-definition cameras can be put to good use, particularly in episodic television -- the image quality looks just fine on TV.

The problem, Pfister says, is that digital technology is being pushed for theatrical exhibition purposes even though it's still an inferior medium. Manufacturers of digital filmmaking equipment are hoping to take advantage of the fact that the technology is changing so rapidly -- today's top-of-the-line high-definition camera is sure to be tomorrow's garage-sale Brownie. Obsolescence guarantees a steady revenue stream, as most of us know from having to replace our computers every other year.

At the moment, for the serious cinematographer's craft, Pfister says, high-definition cameras are simply not the best option. "It's like asking us to work with crayons rather than oil paints. You can do some incredible works of art with crayons. But with the current videotape format, you're never going to capture the textures, the depth, the richness, be able to see into the absolute subtleties and the shadows, that you can with 35-millimeter film.

"Most cinematographers, we have a great passion for the artistry involved in our jobs. And what we love the most is painting with light. If somebody tries to tell us that the paintbrushes that they have are better than the ones that we're using, we're gong to be leery. And we're going to be the ones to decide whether those paintbrushes or that particular paint is as good as it should be. Because we know it better than anybody else does."

- - - - - - - - - - - -

John Bailey says that a few years ago at the Sundance Festival, when one of the "film is dead" factions of young directors was making itself rather noisily heard, he decided to shoot something on digital video, just to find out what it was like. He thought he might end up doing a small project, a 10- or 15-minute student-type film. Then Jennifer Jason Leigh approached him to shoot the feature she was making with Alan Cumming, "The Anniversary Party."

"So I decided to really embrace the D.V. aesthetic and make it as much like a film looks as I really could, given the technology and the equipment," Bailey says.

In retrospect, Bailey says, both he and Leigh realize there was no reason not to have shot "The Anniversary Party" on 16-millimeter film and then blown it up to 35-millimeter, given the time, effort and cost of doing the project digitally. But the experience of making "The Anniversary Party" helped him define some of the differences between film and video. He realized that some independent features -- he cites Neil Burger's "Interview With the Assassin" -- have really benefited from embracing new technology.

Digital technology and celluloid technology have also converged in recent years, he says, "to create images and propel stories in a way that was impossible even six or seven years ago." We're seeing more and more images that are being created completely through computer-graphic technology -- images that couldn't possibly exist in the real world.

"This kind of incredibly rich, complex and highly artistic image creation on computers becomes sort of a new norm and expectation, sort of an image default position," Bailey says. "It has the possibility to desensitize our ability to look at beautifully captured, real, natural images.

"I'm not trying to pick on any particular film, but I'm thinking of a film like 'The Two Towers,' or even the first one ['The Fellowship of the Ring'], where there's just so much incredible stuff that's created only on computers. There are now studios that are thinking about going and shooting in New Zealand. Admittedly, New Zealand is incredibly beautiful, and those movies had some wonderful natural landscapes. But those studios are going there looking to capture images that were captured only on computers."

In other words, they want to shoot on location in Middle-earth.

"It's a very exhilarating time and also a very anxiety-inducing time for all filmmakers and especially for cinematographers," Bailey says. "I've been joking to a lot of my colleagues that I think given the surfeit of energy and speed and lushness in films today, the most revolutionary and the most daring images you can create are very simple images, very directly captured images." Bailey recently had another look at a 1966 Tony Richardson film, "Mademoiselle," shot by David Watkin (in widescreen black-and-white, something of a rarity in itself), in which Jeanne Moreau stars as a sexually repressed schoolteacher who wreaks havoc in her small village, poisoning wells and burning things down.

"It's incredible, but the fascinating thing about it is there's not a single camera movement in the entire film," Bailey says. "All the action happens within a static frame. This film is, like, two hours long, and it's absolutely riveting. It's so unlike anything that you would ever see now."

If you spend enough time going to the movies, you begin to realize that you never quite know what's going to inspire the filmmakers, the cinematographers and the screenwriters of tomorrow. In 10 years, will every mainstream Hollywood feature look like "The Matrix Reloaded"? Is that what audiences will expect and demand?

Maybe. But for every movement, every trend, there's a backlash. For every 10 kids who decide they want to be filmmakers after seeing the Wachowski brothers' action-fest, there might be one strange little tyke who manages to catch "2001: A Space Odyssey" or "Vertigo" or "The Night of the Hunter" or "Blue Velvet" and decides she wants to make a movie that looks like that. Change is coming, as it invariably does, and the only hope for those of us who love movies and care about their future is to bet our money on the strange ones. The revolution will not be televised. No matter how it plays out, it will be coming to a movie screen near you.


By Stephanie Zacharek

Stephanie Zacharek is a senior writer for Salon Arts & Entertainment.
