Should we erase painful memories?

"Eternal Sunshine of the Spotless Mind" could soon become a reality -- but the concept raises some thorny questions

Published December 31, 2011 9:00PM (EST)

This article was adapted from the upcoming book, "Memory: Fragments of a Modern History," available in January from the University of Chicago Press.

One of the most tenacious themes of 20th-century memory research was the idea that people tormented by the memories of terrible experiences could benefit from remembering them, and from remembering them better. The assumption — broadly indebted to psychoanalysis — was that psychological records of traumatic events often failed to be fully “integrated” into conscious memories. As long as these records remained “dissociated,” the sufferer was compelled to “relive” them instead of benignly remembering them. The more fully and appropriately one remembered terrible events, the more attenuated would be their emotional power.

But in the 1990s — a time when psychoanalytic assumptions were being challenged as never before — neuroscience researchers developed a new framework for thinking about remembering, forgetting, and the mind’s record of past events. One result was a highly controversial new paradigm for treating traumatic memories. The problem with bad memories, these new researchers claimed, is not their complex and unresolved relation to one’s sense of self, but the simple fact that they are unpleasant. These researchers defined emotional memory not in terms of repressed ideas, but by certain patterns of neuron action and the chemical changes they triggered. The next step was to change these patterns.

In the 19th century, character was commonly portrayed as something that was built up by daily experience and personal choices, through the memories and habits created by those everyday events. Character was the result of how one responded, moment by moment, to the challenges of daily life, because those responses built up a kind of internal machinery of habit. Character was defined, in a way, as an accretion of memory. The idea of “building character” meant striving to make appropriate choices because the hardware one created would become difficult or impossible to change later.

One scholar of the present day who has expressed concerns about memory dampening has used a different analogy to describe the relation between memory and personality, but nevertheless one that describes personality as being made out of discrete memories. William B. Hurlbut, a consulting professor in human biology at Stanford University, wrote that “the pattern of our personality is like a Persian rug.” It was built “one knot at a time, each woven into the others. There’s a continuity to self, a sense that who we are is based upon solid, reliable experience. We build our whole interpretation and understanding of the world based upon that experience or on the accuracy of our memories.”

As recently as the 1990s, people who thought of themselves as survivors of terrible trauma often defined themselves in relation to what they remembered (or what they did not): They were survivors because they had survived certain defining events. Their character as mature adults came from “working through” these terrible memories. But there were also “survivors” who felt they had not truly survived their memories. They described a wounded self whose bad experiences stood in the way of personal realization. This latter convention involved the idea of a hidden, unimpaired self encumbered by adverse conditions. The idea is similar in some respects to the characterization used in the marketing of recent psychotropic drugs, especially Prozac. These drugs’ enthusiasts sometimes declared that taking them allowed their “true” selves to emerge, often for the first time.

Philosophers, therapists, filmmakers and bloggers have been quick to reflect on the implications of memory erasure. One of the best-known projects to explore the subject is the 2004 film "Eternal Sunshine of the Spotless Mind," in which the main character attempts to have memories of his ex-girlfriend deleted from his mind pharmaceutically. The film embraces the new neuroscience of emotion, focusing on memories of feelings and the complex ways different kinds and parts of memories are stored in different places in the brain. Central to the plot is the idea that memories with different emotional associations are stored differently.

Emotions associated with a past event can be stored independently of the “data” of the event itself, so that someone with amnesia about a particular event that took place, for instance, at a certain location could feel an echo of the emotion associated with that location despite not recalling what happened there. The idea of reconsolidation is central not only to the basic plot (the main character is made to remember his girlfriend so that his memories of her can be systematically removed) but also to one particularly important element of the movie, in which some part of the main character’s consciousness becomes aware of the erasure as it is happening, and he changes his mind. Because he is anesthetized, he is unable to tell the operator to stop the procedure.

What follows evokes the idea of a Memory Palace — the medieval technique of committing things to memory by visualizing each piece of information as a room in a great palace. To recall a specific piece of information, one would imagine walking through the appropriate room. In the film, the lead character and his memories conspire together to save the memory records by stashing them in unlikely places on the palace grounds.

The first speculative steps are now being taken in an attempt to develop techniques of what is being called “therapeutic forgetting.” Military veterans suffering from PTSD are currently serving as subjects in research projects on using propranolol to mitigate the effects of wartime trauma. Some veterans’ advocates criticize the project because they see it as a “metaphor” for how the “administration, Defense Department, and Veterans Affairs officials, not to mention many Americans, are approaching the problem of war trauma during the Iraq experience.”

The argument is that terrible combat experiences are “part of a soldier’s life” and are “embedded in our national psyche, too,” and that these treatments reflect an illegitimate wish to forget the pain suffered by war veterans. Tara McKelvey, who researched veterans’ attitudes to the research project, quoted one veteran as disapproving of the project on the grounds that “problems have to be dealt with.” This comment came from a veteran who spends time “helping other veterans deal with their ghosts, and he gives talks to high school and college students about war.” McKelvey’s informant felt that the definition of who he was “comes from remembering the pain and dealing with it — not from trying to forget it.” The assumption here is that treating the pain of war pharmacologically is equivalent to minimizing, discounting, disrespecting and ultimately setting aside altogether the sacrifices made by veterans, and by society itself. People who objected to the possibility of altering emotional memories with drugs were concerned that this amounted to avoiding one’s true problems instead of “dealing” with them. An artificial record of the individual past would by the same token contribute to a skewed collective memory of the costs of war.

In addition to the work with veterans, there have been pilot studies with civilians in emergency rooms. In 2002, psychiatrist Roger Pitman of Harvard took a group of 31 volunteers from the emergency rooms at Massachusetts General Hospital, all people who had suffered some traumatic event, and for 10 days treated some with a placebo and the rest with propranolol [a beta blocker]. Those who received propranolol later had no stressful physical response to reminders of the original trauma, while almost half of the others did. Should those E.R. patients have been worried about the possible legal implications of taking the drug? Could one claim to be as good a witness once one’s memory had been altered by propranolol? And in a civil suit, could the defense argue that less harm had been done, since the plaintiff had avoided much of the emotional damage that an undrugged victim would have suffered? Attorneys did indeed ask about the implications for witness testimony, damages, and more generally, a devaluation of harm to victims of crime. One legal scholar framed this as a choice between protecting memory “authenticity” (a category he used with some skepticism) and “freedom of memory.” Protecting “authenticity” could not be done without sacrificing our freedom to control our own minds, including our acts of recall.

The anxiety provoked by the idea of “memory dampening” is so intriguing that even the President’s Council on Bioethics, convened by President George W. Bush in his first term, thought the issue important enough to reflect on it alongside discussions of cloning and stem-cell research. Editing memories could “disconnect people from reality or their true selves,” the council warned. While it did not give a definition of “selfhood,” it did give examples of how such techniques could warp us by “falsifying our perception and understanding of the world.” The potential technique “risks making shameful acts seem less shameful, or terrible acts less terrible, than they really are.”

But the mere possibility seems to have threatened an important convention for representing memory in relation to personal identity. These worries draw their force from a deep-seated attachment to two related beliefs: first, that we are, in some ambiguous but important way, the accretion of our life experiences; and second, that those life experiences are perfectly preserved even if our ability to remember them is far from perfect. When Alzheimer’s disease patients lose significant amounts of memory, dismayed friends often say that their very selves have crumbled or faded away and that in some literal way they are “no longer themselves.”

The thought here is not that people believe their memories are perfect — far from it. Common understandings of memory centrally involve the idea that memories are unreliable, fickle and capricious. But there is another belief about memory that has been articulated by many figures in memory research: that in some fundamental way, secreted within us are perfect records of past experiences, even if we might never access them consciously.

The concerns of the President’s Council naturally raised the question of how psychic pain differs from physical pain, a distinction the council never discussed explicitly. The council did not set itself against the use of physical anesthesia, even though there are well-known cases where pain is thought to serve a useful purpose. Yet physical anesthetics have some relevant common ground with the prospective memory technologies, and there is a long history of resistance to the idea of physical anesthesia on grounds similar to some of the arguments being mounted here. Skeptics argued that a loss of sensation would disconnect sufferers from a valuable experience (as in childbirth) and from information they needed to have. Before the advent of anesthesia, techniques that seemed to involve an intentional suspension of sensation could trigger alarm on a scale that now seems almost inconceivable.

For instance, in the 1830s, during disputes over whether mesmerism could create an altered state of mind in which an individual was entirely incapable of sensation, the editor of a major London medical journal urged his readers to consider such a thing impossible not merely because it was implausible but because it would be an immense moral affront and a threat to one’s personal integrity. “Consider the implications,” he urged his readers: “the teeth could be pulled from one’s head without one’s knowledge.”

The 1840s and 1850s saw a dramatic shift in the status of physical pain, specifically in its relation to self-knowledge, self-control and personal integrity. Anesthesia was initially upsetting because it challenged a convention for thinking about personal identity and self-control to which full sensory experience was central. Viewed from the era of anesthesia, this anxiety has come to look quaint or even inexplicable. In this era the suspension of sensation is taken to be radically distinct from how we understand ourselves. Could the same happen with remembering? The neurosciences of the present day are inaugurating a period of uneasy speculation about moral and identity issues that promises to be the most profound and far-reaching of all — at least until the next revolution in the memory sciences.

Reprinted with permission from "Memory: Fragments of a Modern History," by Alison Winter. Published by the University of Chicago Press. Copyright 2012 University of Chicago. All rights reserved.


By Alison Winter

Alison Winter is an associate professor of history at the University of Chicago and the author of "Mesmerized: Powers of Mind in Victorian Britain," also published by the University of Chicago Press.
