With the release of last August's ill-received "Simone," moviegoers were teased with the promise of a look at Hollywood's future. The film's promotional materials suggested that the film would focus on the long-controversial topic of "synthespians," computer-generated actors who would be capable of replacing their human counterparts.
To the disappointment of critics and audiences alike, this turned out to be little more than a gimmick: The film focused not on the world's reaction to the first virtual cinema star, but on the attempts of a second-rate director (Al Pacino) to conceal her "true nature," lest his work be written off as a fraud. In one of the film's accidentally philosophical moments, the director attempts to convince himself that "if a performance is genuine, it doesn't matter if the actor is real or not." And, though "Simone" never gave audiences cause to consider this claim further, recent developments have made the proposal one that can no longer be ignored. The real worth of "Simone" is found in the unanswered question that it poses: Even if a celebrity were "virtual," would there be any reason to hide it?
Well, here's one reason: We don't know how audiences will react to a performance that owes more to computer code than to human talent. Perhaps, as "Simone" suggests, it shouldn't matter whether an actor is real or not, but with the announcement of the nominees for the 75th Academy Awards last Tuesday, it has become clear that -- at least for now -- it still does. The most interesting question of this year's Oscar race was not who would be named, but whether Hollywood was prepared to adjust traditional definitions of talent and achievement to honor the first wave of computer-generated performers.
The question arose in response to public statements made in December that New Line Cinema would seek a best supporting actor nomination for Andy Serkis, the human actor whose talents were tapped to guide the computer-generated performance of the pitiful wretch, Gollum, in "The Two Towers," the second installment of the epic "Lord of the Rings" trilogy. And while Gollum's performance is nothing short of breathtaking, earning praise from critics as the most impressive aspect of the altogether overwhelming "Towers," it was not clear whether audiences or Academy members were prepared to see Oscar honors go to an actor whose face they have never seen. Is Hollywood ready to acknowledge and honor digital performances, or even human-digital hybrids? This year, the answer seems to be a resounding no. When the nominees for best supporting actor were named on Tuesday, Serkis was not among them.
Even to those who believe that Serkis gave an Oscar-worthy performance, the decision doesn't come as much of a surprise. New Line and the "Rings" production team recognized the difficulty they faced in gaining a nomination for Serkis' Gollum, whom audiences might be tempted to mistake for an improved descendant of "Star Wars'" much-reviled Jar Jar Binks. "Rings" executive producer Mark Ordesky has explained in interviews that, unlike the other synthespians audiences have encountered, "Gollum is groundbreaking, because he's not only CGI, but is actually a performance-based character. He's not comic relief, he's not an antic. He really is a major dramatic character."
According to "Rings" director Peter Jackson, the most significant difference between Serkis' Gollum and the computer-generated actors that precede him is Serkis himself. Prior to "The Two Towers," all digital characters were developed mainly in postproduction: designed, refined and then inserted into completed sequences alongside actors who had delivered their lines to on-set substitutes constructed from broomsticks, tennis balls and the like.
For Gollum's performance, which offers much of the emotional and dramatic resonance in "The Two Towers," Jackson found such techniques insufficient. Instead, he brought in Serkis to perform Gollum's part on the set in real time alongside the other members of the "Rings" ensemble, in an attempt to capture the flow and pace of genuine interaction. Serkis then spent months in postproduction with the visual effects team re-creating his performance, one scene at a time, in an elaborate motion-capture animation studio that recorded his most minute movements, gestures and expressions. The final animation of Gollum was generated from this recorded model, and painted into the film in place of Serkis.
"What was important," Jackson explains, "is that there was one person, an experienced, skilled actor, making all of the decisions on behalf of Gollum. [Andy] would decide how Gollum would move, how he would act, what emotion he would have, what pauses he would put where, what weight he'd put into a particular scene -- just as any actor, like Elijah and Sean, would be doing for their characters."
Although such explanations make a fascinating and compelling case for honoring Serkis' contribution to the film, the idea of awarding an Oscar nomination to an actor who never appeared on-screen is still difficult to swallow. But Jackson and Serkis have pointed out a significant parallel between their own situation and that of David Lynch's 1980 release "The Elephant Man," which received eight Oscar nominations. Most significant among these was a best actor nomination for John Hurt, who performed the haunting title role beneath elaborate prosthetics that left only his eyes visible. As Serkis has explained in interviews, "[Hurt] gave a voice and a physicality, but was completely disguised by the prosthetics, and this in many ways is similar." In Gollum's case, New Line argues, the prosthetic is computer-generated instead of physical, but the question is the same: Does the performance belong to the actor who brings a character to life, or to the production team that gives the character its form?
While Jackson and Serkis have pointed to "The Elephant Man" for precedent, perhaps they do not realize just how instructive their example is. Although nominated for best actor, John Hurt did not win in 1980, a loss he has partly attributed in recent interviews to the fact that the audience could not recognize him at all. It seems almost certain that Serkis, had he been nominated, would have faced the same problem. The more interesting and less noted parallel, however, is the Academy's treatment of the technical contributions made by each film. While it seems certain "The Two Towers" will claim this year's honors for best visual effects, "The Elephant Man's" groundbreaking contributions in the field of cinematic makeup and prosthetics, surprisingly, went unrecognized by the Academy. Why? Because in 1980, the Academy had not yet instituted an award for best achievement in makeup; the category was announced in 1981 as a direct result. The technical accomplishments of "The Elephant Man" were never honored, but the oversight forced the Academy to revise its own structure to accommodate an emerging field.
Perhaps it is too soon to suggest that the Academy take similar measures to create an award honoring computer-enhanced human performances, though the upcoming releases of Ang Lee's "The Hulk" and the all-computer-generated feature "The Polar Express" might prove otherwise. (It is interesting to note, however, that the Broadcast Film Critics Awards now include an annual award for best digital performance.) In any case, the cases of "The Elephant Man" and Gollum both seem to suggest a greater problem: that the Academy has never been well prepared to evaluate or honor groundbreaking innovations and accomplishments in cinematic production. In most cases, the films that first utilize innovative techniques or approaches to moviemaking rest in unmarked graves, anonymous soldiers who paved the way to recognition for their often-inferior successors.
The truth is that it's never been altogether clear who "owns" a film performance, and the issue has become even less clear as cinematic production techniques become more and more advanced. Onstage, lighting, makeup and costuming affect performances, but the actor ultimately retains control over how he or she is presented to the audience. In film, this has never been the case: Long after the actor has left the set, directors and producers make decisions that determine what audiences will and will not see, and how they will see it. The question becomes one of where to draw the line: Does the application of a "virtual prosthetic" represent a more significant alteration of an actor's performance than a director accomplishes through editing?
As digital effects become increasingly prevalent in cinema, this question of performance ownership will become impossible to ignore, and audiences will be forced to decide whether it is the performer or the performance that is worth evaluating.
The initial question, of course, remains unanswered. Should Serkis have received a best supporting actor nomination for his contribution to the performance of Gollum? In the end, the answer is no, not because his talents are less significant than those of the supporting actor nominees, but because the work that he has done here is not equivalent. It would be a disservice to the other nominees to compete against the computer-enhanced Serkis, just as it would be a disservice to Gollum to be written off as an accomplishment of acting. The fact is that Gollum represents a new breed of synthespian performers, far more interesting than "Simone's": computer-generated performance not as a replacement for human performance, but as an extension of it.
Ivan Askwith is a senior at the Gallatin School of Individualized Study at New York University, graduating this spring with a focus in social philosophy and technoculture.