Michael DeVita of the University of Pittsburgh recalls making the rounds at a student teaching hospital with his interns in tow when he remembered that he had a patient upstairs who was near death. He sent a few of the young doctors “to check on Mr. Smith” in Room 301 and to report back on whether he was dead yet. DeVita continued rounds with the remainder of the interns, but after some time had passed he wondered what happened to his emissaries of death. Trotting up to Mr. Smith’s room, he found them all paging through “The Washington Manual,” the traditional handbook given to interns. But there is nothing in the manual that tells new doctors how to determine which patients are alive and which are dead.
Most of us would agree that King Tut and the other mummified ancient Egyptians are dead, and that you and I are alive. Somewhere in between these two states lies the moment of death. But where is that? The old standby — and not such a bad standard — is the stopping of the heart. But the stopping of a heart is anything but irreversible. We’ve seen hearts start up again on their own inside the body, outside the body, even in someone else’s body. Christiaan Barnard was the first to show us that a heart could stop in one body and be fired up in another. Yet despite a mountain of evidence to the contrary, “brain death” marks the moment of legal death in all fifty states.
The search for the moment of death continues, though hampered by the considerable legal apparatus that insists that it has already been found.
Gregory Sorensen, MD, subjected cats to magnetic resonance imaging (MRI) exams at Massachusetts General Hospital in Boston while he killed them with injections of potassium chloride, the same chemical used by Jack Kevorkian and lethal-injection executioners. Sorensen took MRI images while the cats died, watching to see what happened to their brains during the process. He was hoping to photograph the moment of death.
Sorensen says that the idea of “irreversibility” makes the determination of death problematic. What was irreversible, say, twenty years ago, may be routinely reversible today. He cites the example of strokes. Brain damage from stroke that was irreversible and led irrevocably to death in the 1940s was reversible in the 1980s. In 1996 the FDA approved tissue plasminogen activator (tPA), a clot-dissolving agent, for use against stroke. This drug has increased the reversibility of a stroke from an hour after symptoms begin to three hours.
In other words, prior to 1996, MRIs of the brains of stroke victims an hour after the onset of symptoms were putative photographs of the moment of death, or at least brain death. Today those images are meaningless. One can take MRIs for another two hours and still not be sure one is photographing death. What about MRI images taken three hours after the onset of stroke? There is no confidence that that will be the end either. It is safe to assume that medical breakthroughs will continue to make “irreversibility” meaningless.
Sorensen says, “We have not yet quantified when brain death is irreversible.” He adds, “No one wants to volunteer for the experiments.” As for his dead cats, Sorensen said he could see the tissue infarcting, but he could never pinpoint the moment of death. “I’m not sure MRI can prove that someone who is dead (or a mummy) won’t come back to life. As a scientist, you simply have to say such events are extraordinarily rare. As a believer, you can say whatever you’d like; I’m a believer, so I do believe that people will live again … but I won’t try to use MRI to convince you of that position.” Sorensen is a nice, friendly guy, and I hope for his sake that God is not a cat lover.
“What’s alive and what’s dead breaks down when we get above the cellular level,” Sorensen says. “Pathologists don’t feel comfortable that a brain is dead until the cell walls break down. True cell death is a daylong process.”
Lawrence Schwartz says cell death is not so simple. Schwartz is a professor of biology at the University of Massachusetts at Amherst. He does not see human beings as the ultimate species, most deserving of study by biologists. Humans are at best a footnote in the history of evolution. “We’re a failed experiment,” he told me.
Cell death is far removed from brain death. As shown, brain death can be declared when only a few brain cells have actually died. Cells in the remainder of the body are alive and kicking. Brain-dead patients being sustained as beating-heart cadavers are still supplying most of their body’s cells with blood and thus oxygen, so total cell death is nowhere in sight. Cell death begins in earnest when the heart stops beating and the lungs cease to breathe. No longer being pumped through the body, the blood will drain from the blood vessels at the top of the body and collect in the lower part. The upper body will become pale, the lower body turning much darker, looking bruised. This is livor mortis.
Even at this point, however, most cells are still not dead. After the heart stops, brain cells will die in a few minutes. Muscle cells can hold on for several hours, and skin and bone cells can stay alive for days. Cells switch from aerobic (with oxygen) respiration to anaerobic (without oxygen) when the blood stops circulating. A by-product of anaerobic respiration is lactic acid, which is what makes your arm muscles hurt during arm wrestling or your legs hurt during a hard run. When you are alive, your blood flow clears out the acid, but in a dead person the acid accumulates and the muscles stiffen. This is rigor mortis. Rigor mortis usually begins about three hours after the heart stops and lasts thirty-six hours. Eventually all of the cells die. After rigor mortis come initial decay, putrefaction, black putrefaction, and butyric fermentation. Somewhere in these processes — taking as long as a year, depending on the conditions and the weather — is a moment of death. Where that is may be impossible to determine.
Cell death, though, warns Schwartz, is not always a simple matter. We’ve been talking here about necrosis, the simpler of the two kinds of cell death. It’s messy. The cell membrane begins to leak, water and calcium rush in, and the cell bursts.
There is also apoptosis, a process of programmed cell death in all of our tissues to remove surplus or defective cells. It is estimated that we lose on average about 1 million cells per second to apoptosis (Greek for “falling off”). This type of cell death is inextricably intertwined with and essential to life. Apoptosis is what allows us to have distinct fingers and toes. When we were embryos we had webbed feet and hands. Apoptosis, a kind of cellular suicide, in the tissue between the digits, is what gives us hands capable of playing the guitar or holding a cigarette. “It is a misregulation of cell death,” says Schwartz, “that underlies about seventy percent of human disease.” Cancer, for example, results from the ability of cancerous cells to evade apoptosis and grow out of control.
More dramatic is the role of apoptosis in the metamorphosis of caterpillars, tadpoles, and other animals. Cell death in the caterpillar is massive. Any layman can see that there is a huge difference between a caterpillar, at larval stage, and the moth or butterfly, the “perfect insect” that emerges after metamorphosis. The caterpillar, says Schwartz, is a “walking gut.” It is indeed “very hungry,” as the children’s book author Eric Carle wrote.
A caterpillar increases in weight 10,000-fold in three weeks. The moth, by contrast, is a fucking machine. The gut is no longer needed for food and is now filled with gonad. Some moths don’t have mouths. Eating is not a priority. During metamorphosis, huge numbers of muscle cells die within thirty-six hours. The larva is cannibalized, says Schwartz. The legs are mostly retained, but the brain changes dramatically. The nervous system is retained but rewired. Different neurons are required to send different directions to muscles. After all, the animal has gone from crawling to flying.
The “self” is usually defined as consciousness. Lepidoptera metamorphosis challenges the logic of centering a declaration of death on irreversible lack of consciousness. No biologist claims that the caterpillar dies and a moth is born. It’s the same animal. Yet most of the caterpillar’s cells have died, irreversibly, particularly in the brain, far more than have died in human patients declared brain dead. Does the moth remember its life as a caterpillar? Does it feel as though it’s the same individual as it was during the prechrysalis days? Where is the “self” of the caterpillar?
There is some evidence, though scanty, that some memories may be maintained. Caterpillars were trained to turn to the left or right at the end of a T-maze, the researchers administering electric shocks to steer them from left to right, or vice versa. Following metamorphosis, the adults, the moths, were made to run a similar maze. There appeared to be evidence of learning — turning right or left, according to their conditioning — that survived from caterpillar to moth. The experiments, conducted at the University of Massachusetts at Amherst, have been repeated throughout the years, but a level of skepticism remains that consciousness survives metamorphosis.
My point is that by today’s brain-death standard for humans, a moth would not be considered alive, its “self” having perished in the chrysalis. Even if one acknowledges the right-turn/left-turn memory, it would not be accepted as substantive consciousness by today’s medical establishment, which dismisses such traits as reaction to pain, penile erection, and successful pregnancy as not adding up to a “self.”
As for a moment of death, Schwartz says that with apoptosis, there may be a moment when “the Slinky’s been pushed down the stairs, committed to death,” but with plain-vanilla cell death, necrosis, there is “not a specific moment, the result of a catastrophic event.” I asked him when all the cells in a human body are dead. He said, “That’s a philosophical question.” Then he asked, “Do cells suffer when they die?”
“Personhood” is a word that doctors throw around today as if it were a scientific term. Alan Shewmon believes it has nothing to do with medicine but is rather a moral concept. In 2000, at the Third International Symposium on Coma and Death, held in Havana, Shewmon presented evidence that some brain-dead patients are still alive, including a video of a patient who, at the time, had been brain dead for thirteen years. (He would die via cardiopulmonary criteria seven years later.) Despite the fact that this boy, who was on a respirator, passed all brain-death criteria, his shoulder twitched, he sprouted goose bumps, and his hand went into spasms when his arm was lifted by the wrist.
Like other brain-dead patients, he healed from his wounds while supposedly dead, and he continued to grow. Gary Greenberg, a writer who covered the symposium for The New Yorker, reported that no one took issue with Shewmon’s science, but doctors continued to say that brain death was valid because the “person” was missing from such bodies. Most revealing was Fred Plum, the neurologist responsible for the term “persistent vegetative state,” who immediately challenged Shewmon at the end of his presentation: “This is anti-Darwinism. The brain is the person, the evolved person, not the machine person. Consciousness is the ultimate. We are not one living cell. We are the evolution of a very large group of systems into the awareness of self and the environment, and that is the production of the civilization in which any of us lives.”
There are so many things going on in Plum’s statement that it is hard to decide where to begin. The implication here is that we humans are more complicated than other life-forms, that we have consciousness (and they don’t, or not very much), and thus we have a higher bar to hurdle to maintain our personhood. Therefore we need a lower standard of death than chimps or amoebas do.
But is this really Darwinism? I asked Janet Browne, a Harvard historian of science and the preeminent biographer of Charles Darwin. To Plum’s statement she responded, “This is not really anything that Darwin would have given any thought to. However, he certainly believed that there were close connecting links between the animal and human and that humans were not special.” I am somewhat glad that Plum is dead and doesn’t have to read Browne’s statement. She continues, “Yet in the ‘Descent of Man’ he [Darwin] also agrees with most of his readers that humans have more developed brain and culture than animals. This is not an absolute difference, merely one of degree.” Darwin’s name today is invoked almost as often as God’s on topics that neither would have given any thought to.
Plum claims that “when the cognitive brain has departed, the person has departed.” This might be a common belief among MDs, but the location of the seat of “personhood” or “soul” or “the self” or whatever has been debated without closure for thousands of years. And there is ample evidence that the “person” is not contained wholly in the brain.
Candace Pert, the discoverer of the opiate receptor in the brain, says that there has been a new paradigm in neuroscience since about 1995. More than three hundred common molecules, chemicals, are found in the brain, the immune system, and bone marrow. In other words, brain chemicals partly responsible for consciousness are being found all over the body. When Pert says, “The body-mind is one,” she’s speaking not as a Buddhist but as a biochemist, though Buddhism, she says, may have anticipated this discovery. “Consciousness,” she adds, “is a property of the entire body.” Just do this mind experiment: Take the brain of Michael Jordan and put it into the body of Woody Allen. Or vice versa. Do you think Jordan’s “personhood” would have turned out the same if contained in Allen’s body? It’s an old mystery, and we will not solve it here.
The brain-death revolution has been driven by organ transplantation but justified in part by the modern form of evolution known as neo-Darwinism. Famed evolutionary biologist Lynn Margulis pointed out that neo-Darwinism borrows heavily from creationism for its structure. The creationists see life-forms organized into a tree or pyramid, with humans at the top, and God presiding over it all. Neo-Darwinists have a similar tree with humans at the top and all other life-forms below, with natural selection sitting in for God. Margulis envisioned life in a horizontal structure, more of a bush than a tree, with no species “more evolved” than another. “There is no crown of creation,” she said.
Browne agrees: “[Darwin] differed from Lamarck, or even from his evolutionary grandfather Dr. Erasmus Darwin, in that he eschewed any doctrine of ‘necessary progression’ and inner striving towards perfection … he always regarded the chief difference between [Lamarck and him] to be that he, Darwin, did not allow his organisms any future goal, any teleology pulling them forward, or any internal force that might drive the adaptive changes in specific directions. On the contrary, Darwin’s scheme of evolutionary adaptation was based entirely on contingency. Organisms shifted randomly.”
The idea of a hierarchical natural world, deterministic and purposeful, with man at the top, is essential to our system of separate death criteria for humans as opposed to all other animals. Margulis called it anthropocentric. “Other animals do not have language,” Margulis whimsically pointed out, “and are therefore inferior because they cannot speak of their superiority.” The belief that life is rigorously and logically organized and that natural selection proceeds in a rational manner also seems at odds with the original Darwinism of Darwin. Cell-death expert Schwartz says, “There is no grand plan.” In fact, if you look at the way organisms work and are structured, he says, “You wouldn’t do it this way. It’s a Rube Goldberg design.”
Shewmon, though not opposing the discontinuation of life support for severely disabled patients, points out that Plum’s statement about the self illustrates that doctors are not making medical judgments but rather moral judgments about who deserves to live or die.
Let me apologize to doctors before I write what I am about to write. All of my doctors have been terrific, most of them nerds, and many clueless about all things human. They have spent the first part of their lives studying endlessly, the second half working hard to learn their craft and pay back their student loans. Recently, while being prepped for a procedure, I heard the doctor in charge wrapping up a phone call. He was clearly divorced and explaining to someone why he couldn’t take his kids for the weekend. Good, I thought: a man who neglects his family is more focused on his job.
But are doctors the people we want to be in charge of determining “personhood”?
In 1989 my father suffered a heart attack followed by a stroke. We are not a sentimental bunch; my dad and I spoke on the phone once a year on his birthday. I flew from Massachusetts to Minneapolis to say good-bye. I found him in the ICU. Upon seeing me, he propped himself up, shook my hand, and said, “Thank you for coming.” We had a short talk. It wasn’t of Oscar Wilde quality, but it qualified as human-to-human conversation, I thought. He wanted to know if I liked Massachusetts. I had just moved there from Manhattan.
A young doctor interrupted us, and I left the room. The doctor approached me later across the hall, and said, “I’m sorry, but your father’s not going to regain consciousness before he dies.” I mentioned, guardedly, that I had just talked to him. The doctor shook his head sadly. “No. You thought you talked to him. He probably made some reflexive sounds, and you interpreted it as speech.” He said not to worry, that many family members delude themselves in this manner. I felt embarrassed, and I believed him.
As a science writer I had followed the “talking chimp” controversy of the 1970s, the researchers of Washoe, Koko, et al. fooling themselves into believing that they were having meaningful conversations with their primate subjects. Those scientists were smarter than I. I could certainly, out of wishful thinking, believe that my father had spoken to me when he was just making unconscious, meaningless sounds.
I saw a second, older doctor, my father’s internist, enter the room. My father opened his eyes, propped himself up again, and shook the man’s hand. I was several yards away, out in the hallway, but the two men appeared to be having an animated conversation. When the doctor exited, I said, “It looked like you were talking to him.” The doctor said, “Sure. I talk to Cliff every day.” I explained what the previous doctor had told me and pointed out the man, who was still on the floor. “Oh,” said the older doctor, “that’s a neurology resident. They teach them that in medical school today. Everybody is dead or in a coma.”
Excerpted from “The Undead” by Dick Teresi. Copyright © 2012 by Dick Teresi. Excerpted by permission of Knopf, a division of Random House, Inc. All rights reserved. No part of this excerpt may be reproduced or reprinted without permission in writing from the publisher.
Dick Teresi is the author or coauthor of several books about science and technology, most recently "The God Particle" and "Lost Discoveries: The Ancient Roots of Modern Science." He has been editor in chief of Science Digest, Longevity, VQ and Omni, of which he was also a cofounder, and has written for The New York Times, The Wall Street Journal, Smithsonian, The Atlantic and Discover, among other publications.