For video games, a moral reckoning is coming

As games get closer to complete realism, developers have to decide whether to use that power for good or evil

By Kevin Wong
Published August 17, 2013 11:30PM (EDT)
Max Payne 3 (AP/Rockstar Games)

She was created with a computer program, but she looked real. The proportions were correct, the hair looked lifelike – the skin even had pockmarks and imperfections. For some reason, however, it felt a bit off. Maybe it was the eyes, maybe it was the way she moved, but the overall effect was, in a word, creepy.  This phenomenon is called the "uncanny valley," and for some game developers, it’s the final barrier between fantasy and reality.

The "uncanny valley" refers to one’s psychological response to a visual representation – say, for example, a character in a video game. As the visual representation becomes more realistic and complex, a player’s psychological response becomes more positive – he or she begins to identify with the character’s human qualities. At a certain point, however, the dynamic shifts – the more lifelike the character is, the more unsettling that character becomes – and the player will feel disgust. Only when the character is a completely flawless rendition does the response climb back out of the valley.

The human eye is very good at detecting fakeness, and for a long time developers confronted this dilemma; they were able to create characters that were lifelike, but not perfectly lifelike. Technology, however, is moving at an exponential rate, and developers may soon escape the uncanny valley, creating something virtual that looks like flesh and blood.

What issues does this raise, at the point where artificial and real become muddled? And what responsibilities do developers have towards players if they decide to simultaneously push the boundaries of technology and gratuitousness?


Violence in video games has always been a given – in Contra, released in arcades in 1987 and ported to the Nintendo Entertainment System the following year, a player killed hundreds of virtual people with a variety of machine guns. In early games, however, the violence, no matter how ubiquitous, was strangely quaint. It was difficult to be viscerally affected by such rudimentary graphics: a bunch of pixels shooting a pixel at another bunch of pixels.

Only five years later, Mortal Kombat was released in arcades. It was bloody, with dramatic ways of killing your opponent, but what set Mortal Kombat apart from other games was its photo-realism – real actors’ heads and bodies represented playable characters. Digitized voices screamed out when characters got hurt or killed. This wasn’t a matter of square pixels – blood was beginning to look like blood.

Fast-forward to the present day, and there’s no question that violence in gaming can be painful and tasteless. Take 2007’s Manhunt 2, where a main objective is to execute one’s enemies with a variety of instruments – a pair of scissors, a bat, a scythe and a hacksaw, to name just a few. Or take 2012’s Max Payne 3; sure, the player is shooting bad guys the entire time, but the game lingers on its violence in near-pornographic fashion. There are zoom-in shots, freeze frames, slow-mo splatters and low-shot angles of every major kill. Realism is one thing, but context also matters; no one would argue that this romanticizing of violence is necessary to create a believable experience.

And then there are sandbox games, which give players the free will to do wrong. In 2010’s Red Dead Redemption, the player can tie a prostitute to the railroad tracks. When she’s ground up by the oncoming train, the player earns a trophy for this specific "achievement."

No, a player doesn’t have to do this in order to win the game. But that doesn't really matter. The act of putting it in the game to begin with is where the moral justification fails. It’s not a space marine killing aliens, or a plumber stomping on cartoonish turtles – it’s a person killing people, and it's supposed to look real.


Studies on violent video games have not supported the notion of psychological damage. Human empathy develops during the first years of life, well before a person can even pick up a controller. It continues to develop during childhood and adolescence as a result of many factors, such as parental relationships, environmental variables and an assortment of life experiences. To target video games as the main culprit of violent behavior seems simplistic at best.

“On one hand, it seems common sense to think that exposure to violence can desensitize the observer,” says Dr. Jean Decety, professor of psychology and psychiatry at the University of Chicago. “On the other hand, humans are very resilient and adapt quickly to different social contexts. Most of the soldiers torturing detainees in Iraq were probably caring people when they interacted with their loved ones. Probably a minority of them were psychopaths and enjoyed doing so, but this is a minority. I think that most healthy individuals do not confuse games and simulations with reality.”

“There is no solid evidence from brain research that video games lead to antisocial behavior,” Decety continued. “It is one thing to say that video games impact brain circuits – they do – but such a response is transient.”

“I am not sure that [realism in games] will lead to any sort of long-term damage of any sort, once we control for individual differences and potential vulnerability,” Decety says. “Individual differences in empathy will likely outweigh any type of effect from a particular video game.”

So, we’re left with the ethical principle of the issue, something that critics of video games rarely talk about. Realistic depictions of violence, without moral justification, are distasteful – although video games may not directly create violence, they can contribute to a culture of violence, where depictions of pain and suffering become commonplace and acceptable. And furthermore, why not aspire to something higher than a visceral thrill? Game developers should use violence, if they use it at all, to serve an ethical narrative, rather than making it an end in itself.


For David Cage, the founder and creative mind behind game development company Quantic Dream, narrative has always been the top priority. Cage’s last game, 2010’s Heavy Rain, was visually groundbreaking for its time, earning critical praise for its expressive characters. But more importantly to Cage, it meditated upon complex themes more often seen in film: the love between a father and son, moral quandaries about the worth of a single life, and the emotional cost of trusting others.

“[Early] films started with violent scenes, because the technology didn’t allow for anything subtle,” Cage says. “This is exactly where video games are today. We have not yet fully understood that technology is ready for more meaningful experiences. We need to have talent and something to say, which is unfortunately rarely the case.”

Cage was the buzz at this June’s Electronic Entertainment Expo when he presented a new tech demo for the PlayStation 4. Tech demos show off a game developer’s graphics capability; last year, Cage’s award-winning demo was Kara, an emotional meta-commentary on a robot that gains the ability to feel. This year’s demo was The Dark Sorcerer, starring a computer-generated sorcerer and his goblin apprentice and playing out in real time. A hackneyed fantasy narrative eventually reveals itself to be a film set: the CGI characters turn out to be actors, portraying the roles of a sorcerer and a goblin. It was weird and creative, but most importantly, it was funny, and the humor came not only from the words the characters were reciting, but also from the gestures and facial expressions – the actions and reactions that sprang from the back-and-forth rhythm of a conversation. Non-verbal communication was new for a medium that has relied on the bluntest of narratives – fists to faces, bullets to bodies – to tell the majority of its tales.

Cage’s upcoming game, Beyond: Two Souls, portrays 15 years in the life of its female protagonist, tracing her struggles from childhood to young adulthood. The game places emphasis upon choice and emotional conflict, and its realism gives texture and subtlety to these concepts. Realism, when applied judiciously, can affirm morality, and the modern game developer has the ethical choice of whether or not to elevate the discourse. For Cage, the decision has always been clear.

“Sometimes I am surprised at the little sense of responsibility that can be seen in some games,” Cage states. “Some of them give the feeling that they were made by a bunch of teenagers laughing out loud as they were making it. I am dreaming of a day when video games won’t need violence anymore to create interesting experiences. We will then have some credibility beyond our little world and be respected as adults in an interesting, creative medium.”

“Games have no choice if they want to continue to exist in the coming years," he adds. "They will have to grow up.”

Kevin Wong

Kevin is an AP English Language teacher and freelance writer from Queens, NY. His focus is on video games, American pop culture, and Asian American issues. Kevin has also been published in VIBE, Complex, Joystiq, Gawker Media, PopMatters, WhatCulture, and Racialicious. You can email him at and follow him on Twitter @kevinjameswong.

