Peter Cooper stood at the front of the San Diego Westin Hotel’s Plaza Room and methodically worked his way through the 132 slides of his presentation, “Integrated Access to Complex Genomes at the National Center for Biotechnology Information (NCBI).”
Cooper — a slender man, perhaps 40 years old, with a trimmed beard — spoke with an easy academic confidence to a hybrid audience of computer geeks and molecular biologists at the second annual O’Reilly & Associates Bioinformatics Technology Conference. The slide on the screen behind him said something about the Whole Genome Shotgun in GenBank and about Sequence Tagged Sites that pinpoint the segment of gene, EST, mRNA or genomic DNA of a known microsatellite position. I sat in the back row, near an exit, and tried not to have a panic attack. My career as a moral philosopher was only 6 hours old but already I could smell it going up in smoke.
I had come to San Diego looking for a deep understanding of the meaning of life, which I was planning to bottle and sell at a decent markup. I wanted to address the questions that have been implicit ever since Rosalind Franklin’s crystallography revealed God’s schematics for James Watson and Francis Crick to decode: What is a human being? What is our worth? Who decides?
But all I was getting was science, and that was bad. Now, science is OK, of course, for people who like that kind of thing. But I had come to the O’Reilly conference on a self-financed gambit to elbow my way into the moral-philosophy racket. I figured that if Bill Bennett and similar gasbags could make millions spouting mere platitudes and recycled 15th century theology, then I should be able to make a few large, at any rate, for genuine insight into ethical quandaries occasioned by 21st century breakthroughs in biological science.
Especially in 2003, the golden jubilee of Watson and Crick’s seminal eureka. It was the proverbial Klondike opportunity, but in order to cash in I would need to come up with some salable wisdom that was predicated on an actual understanding of current DNA technology. So that’s why I was sitting in Dr. Cooper’s lecture and waiting for profound insight.
Maybe I should mention that I was also pretty broke at the time. I might also mention that two of my three children have different rare, chronic, severe, incurable congenital life-threatening diseases, and that my third child has Lyme disease. So: three children; three rare, horrible diseases — I know what it is like to pray for medical breakthroughs. And I’ve been marginally active in the disability rights movement for 20 years.
As Cooper droned on about curated transcripts and proteins of human, mouse, fruit fly, zebrafish and Arabidopsis — whatever that is — I was imagining myself once again a graveyard-shift pallet jockey in Island Food Products’ refrigerated warehouse. There I was, on assignment from Salon, participating in a scientific conference at the virtual epicenter of the biodigital convergence — where DNA and digital computational paradigms were ineluctably fusing before my very eyes — and all I could think about was how goddamn, goddamn heavy liquid ice cream mix is at 3 in the morning, and how little one gets paid for moving it. Pallet-jockeying is honorable labor, mind you, but I’m 50 years old and I’m kinda sick of working with my back.
And this was the year! That’s why Dr. Watson’s book, with the World’s Most Obvious Title, despite being a little large for the beach bag, was required summer reading for anybody at all hip to the Next Big Thing in evolution. My first step on the road to moral philosophy was clear: write a story exploring the inherent tension between the drive to eliminate unwanted genetic conditions and the imperative of living human persons who have those conditions to claim and maintain their full, perfect, humanity.
My editor was supportive. “But remember,” he said. “The last thing the world needs is another article full of hand-wringing about DNA. I want facts, not whining.” “Gotcha,” I replied. “Facts!”
Those words came back to me as I sat listening to Dr. Cooper explain that the ATP binding cassette transporter superfamily contains membrane proteins that translocate a wide variety of substrates across extra- and intracellular membranes, including metabolic products, lipids and sterols, and drugs. Facts, not whining.
There were facts aplenty at the O’Reilly Bioinformatics Conference. Hundreds of millions of facts, as it turns out: every damn C or A or T or G for every damn critter under the sun was there; I felt like I was drowning in facts. And there were facts at the Whitehead Institute for Biomedical Research — that part of the distributed Mecca of the Human Genome Project that’s located in Cambridge, Mass. — where I went next. And at the Hastings Center, the bioethical think tank in Garrison, N.Y.
And there were facts in Lowell, Mass., where I spoke with the prime mover of the open bioinformatics movement. And there were facts at the Federation for Children with Special Needs, whose founder, an old friend of mine named Betsy Anderson, was a charter member of the Ethical, Legal, Social Implications team of the Human Genome Project. And at the Perkins School for the Blind, where multiply handicapped kids find asylum from a world obsessed with perfection.
Facts were everywhere. But where was the genuine insight into ethical quandaries?
In the spring of 1978 I was in love with this hot scientist who was doing research on shifted reading frames in single-stranded DNA viruses. She was a 28-year-old Ph.D. candidate whose own genotype specified high cheekbones, hazel eyes, brown hair, long Barbie-doll legs, and hubba hubba hubba.
Shifted reading frames, the area of her research, refers to the way more than one meaning can be carried in a text sequence depending on how you break it into words. For example, in English, the sequence NOTIME can be read “No time” or “not I, me.” DNA is read in codons, sequences of three nucleotides. A DNA sequence — for example, TTACGGATTACCC — can be broken into codons in exactly three ways, depending on where you start reading. Early on, scientists discovered codons whose function was to signal where reading was to start and stop. The belief was that the DNA between any start-stop codon pair was read linearly, much like sentences in a book. In fact, DNA sometimes interprets the sequence between start and stop pairs differently, so that, in effect, DNA equivalents of NOTIME sometimes mean one thing and sometimes mean something else.
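The three-frames idea is simple enough to sketch in a few lines of Python. This is a toy illustration of the point above, not real annotation code: it just slices one strand of a sequence into codons starting at each of the three possible offsets.

```python
def reading_frames(seq):
    """Split a DNA sequence into codons in each of the three possible
    reading frames (forward strand only; trailing partial codons dropped)."""
    frames = []
    for start in range(3):
        codons = [seq[i:i + 3] for i in range(start, len(seq) - 2, 3)]
        frames.append(codons)
    return frames

# The example sequence from the text, read three different ways:
for n, frame in enumerate(reading_frames("TTACGGATTACCC")):
    print(f"frame {n}: {' '.join(frame)}")
```

The same thirteen letters yield three entirely different codon sentences, which is the whole trick: one stretch of DNA, several meanings, depending on where you start reading.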
Before this austere trick was sussed out, the thinking was that DNA was essentially obvious, in the sense that one gene mapped to one protein. It was believed that the number of genes in a sequence of DNA could be counted by tabulating the number of “start reading, stop reading” codon pairs, and that the number of genes in an organism’s genotype equaled the number of proteins in that organism. Figuring out what a gene did, on this model, would be like looking up a part number on Amazon.com. Just as each item that Amazon sells has a unique “ASIN” number, so it was thought that there was a one-to-one correspondence between genes and proteins.
But then what was one to make of the fact that the bacteriophage phi X174 had nine genes and 11 proteins? 9 = 11? Or as we would say nowadays, WTF?
Then in the early 1970s a scientist named Frederick Sanger had an epiphany — for which he would be awarded the Nobel Prize (his second) in 1980. “Holy cow!” Sanger said (I’m paraphrasing), “Mother Nature has been yanking our chain!” Turns out that DNA was doing some pretty damn fancy self-modifying recursive hacks, by which it was packing all kinds of meta-information into the plaintext. By now, 2003, everybody knows that DNA is devious, flirtatious and duplicitous. But in 1978 this was still shocking news. The discovery of shifted reading frames was like finding out that your mother had been having an affair for the last 15 years. Or that’s how it seemed to me, in any event.
But I may be anthropomorphizing: harboring jealousy, a quarter century later, against a molecular archetype. For if you want to know the truth, DNA was the only thing that my gorgeous so-called girlfriend ever thought about. Ever. Even in our most intimate moments she was thinking about the tricks, subtleties, sleights of hand, misdirections, camouflages and bad faith of DNA. It was as if she were Sherlock Holmes and DNA were Professor Moriarty. Their relationship was all-consuming, adversarial and to the death. My role was incidental, and biological, like food or a blanket.
Yet so smitten was I, so in awe of her nonchalance in the domain of the nucleotides, that I resolved to learn the code myself. I read textbooks, I sneaked into lectures, I lurked about laboratories. And I read scientific papers that she gave me, and that I barely understood, as I popped Jiffy Pop popcorn over a Bunsen burner at 3 in the morning while she was in the cold room packing a column — or at the next bench running two-dimensional isoelectric-focusing slab electrophoresis gels in preparation for a talk she was to give at Cold Spring Harbor Laboratory that summer.
Oh my God, to be in that laboratory! I pity anyone who has not watched a hot chick, with her hair in a bun, wearing a white lab coat, perform science in the middle of the night. Even her mistakes were sublime: One night while doing some assay she accidentally poked her wrist with a hypodermic full of some nasty mutagen. “When I die of cancer in the name of science 30 years from now,” she said without slowing down or looking up, “you can say you saw it happen.”
I knew then that I was on hallowed ground, in the very presence of Science with a capital “S,” privileged to watch something as selfless, sacred and filled with mortal peril as the first wave landing on Omaha beach. I felt awe, as if Marie Curie were speaking directly to me from some ethereal realm. I felt as if I were popping popcorn for Galileo.
She gave that talk at Cold Spring Harbor; it went well, and then she left for London or Zurich, some lover or husband, it was all so long ago, I cannot remember. She later told me that James Watson and Francis Crick were in the audience when she presented her paper and that Watson complimented her afterward. Whatever the merits of her research, I have no doubt that Watson was more impressed by her ass than her assays. But in any event one does not present papers to the popes of DNA unless one knows whereof one speaks: That chick was the real deal.
Twenty years later I wrote a novel about molecular biology, largely inspired by memories of time spent in that beautiful geneticist’s lab. We’ll come back to her.
Three years ago, the privately held company Celera appeared on the verge of beating the combined scientific teams of the rest of the world to the goal of sequencing the human genome. Celera’s approach was less rigorous but faster than the Human Genome Project’s approach, and for a very understandable reason: Celera’s goal was not to advance science but to win the race by any means fair or foul and thereby claim what would have been the most astonishing conquistadorial prize in human history. For had Celera won the race to sequence the genome, and had it filed patents aggressively, it is conceivable that one tiny company could have laid claim to royalties on virtually all medical progress thenceforward. Nay, they could have claimed proprietary interest in the evolutionary future of the human race.
Never mind that the proposition was more ludicrous, on the face of it, than a private company’s laying claim to the moon. The threat was real, and scientists were scared.
This state of affairs was remedied by the heroic efforts of a once obscure University of California at Santa Cruz biology graduate student named Jim Kent, who, over the course of 40 days of coding so furiously that he literally had to soak his wrists in ice baths every night, wrote a program to assemble and make public the Human Genome Project’s own map. He completed the task one day ahead of Celera.
Kent’s stealth attack thereby beat Celera at its own game virtually single-handedly, in a feat that deserves to become as iconic as Watson and Crick’s.
I spoke with Kent, a genial fellow with wild hair and a big black beard, in the lobby bar of the San Diego Westin shortly after he gave his own O’Reilly keynote talk. Earlier that day he had received the inaugural Ben Franklin Award from Bioinformatics.Org, a nonprofit group whose mission is to promote open biology and defend the free flow of scientific information in an increasingly corporatized world, where even universities are suspect. (Ben Franklin declined to patent his inventions.)
In an age when biology is increasingly becoming a subdiscipline of informationtology, I wanted to find out how a computer scientist’s thinking about DNA differed from the traditional biologist’s. Unlike my long-ago girlfriend — who wouldn’t have known an operating system from a system operator — Kent was a bona fide geek. Before starting his career in biology he had founded and run a software company, sold it to a major corporation, and led a large team of software engineers at the new parent. By contrast, his biological background had been so weak that the path from his career in software to his eventual Ph.D. in biology began at the local community college.
Kent spoke to me in nerdspeak, with geekoid locutions such as the use of “build” as a noun: “That’s the most recent build of the genome. Build 31.” I was used to hearing biologists talking about the elegance of DNA with what might be called reverence. By contrast Kent spoke of DNA as if it were the most convoluted, ill-documented, haphazardly maintained spaghetti code — not God’s most sublime handiwork, but some hack’s kludge riddled with countless generations of side effects, and “parasites on parasites.”
“It’s a massive system to reverse-engineer,” he said. “DNA is machine code. Genes are assembler, proteins are higher-level languages like C, cells are like processes … the analogy breaks down at the margins but offers useful insights.” It was nearly impossible to tell the working code from cruft, Kent said. “That’s why a lot of people say, ‘The genome is junk.’” But that’s what he found interesting: a high-quality programmer’s code is always self-evident, but legacy assembler handed down from generation to generation of bricoleurs (I’m paraphrasing again) provides a real challenge for people who like puzzles.
Kent’s current interest is comparative genomics. In grossly simplified terms, a gene can be defined as a section of DNA that codes for a protein. Well, a lot of DNA in any organism does not code for a protein. So what’s it there for? Is it parasitical, accidental? Is it structural, there to determine the 3-D shape of the DNA molecule, like nucleotidal modeling clay? Who knows? And how does one determine what is or isn’t a gene in the first place — especially given what we know about DNA’s propensity for practical jokes?
The idea behind comparative genomics is that if the same nucleotide sequence exists in fish, mouse, human, whatever, then there’s reason to believe it might be a gene — on the hypothesis that useful stuff is more likely to get inherited than is random crap. It’s an idea as old as the Rosetta Stone, and Darwin.
Because we’re human, and because we want to cure human diseases that are caused by genetic fuckups manifested as wrong proteins, we want to know how human proteins work within us. We learn how a human protein works by understanding how a homologous protein works in E. coli, or a mouse or, Lord knows, a zebrafish. “The link right now is mouse-human,” in Kent’s words. “Next, rat-human. Eventually, all mammals.” So if you want to understand the whole human physiological system, in other words, the first step is to be able to predict what parts of our DNA sequence are genes. And to do that, you need to know what our DNA sequence is. That’s what the whole race to the genome was all about.
“We couldn’t do gene prediction until we had an assembly. Now we have an assembly, but figuring out where the genes are is an NP-hard problem. It doesn’t work so well when you have 100K parts.” He laughed. “Twenty-five percent of the genome has to do with self-regulation. It’s absurdly recursive. DNA is a very strange parallel processor…”
So we talked about the idea of DNA’s self-meta-programming, sort of doing reentrant microcode on its own bad self. This was all fascinating, but it wasn’t getting me any closer to moral punditry. So I changed tack and asked him if all this investigation wasn’t perhaps some kind of giant mistake that would lead to disaster. Wasn’t all this gene manipulation just wrong, in some sense? I suppose I was trying to be provocative.
But Kent agreed with me.
“It’s pretty clear that what’s happening to plants is wrong on many levels,” he said, and cited the notorious case of a “retracted” article in Nature about migration of plant genes, which many people cite as prima facie evidence of the whoredom of modern science to corporate money. Doctoring food crops for the enrichment of agribusinesses and to the detriment of peasants was biologically suspect, ecologically dangerous, politically short-sighted and morally wrong, Kent said, with a wry smile. “We should think hard before we release genes into the wild…”
If that’s true about plants, I said, what about tinkering with human genes? Well, he said, clearly the trend was to find a way to understand the path “from linear DNA to 3-D person.”
“Very slowly — or maybe not so slowly — this stuff will evolve. But look how long we’ve known about the cystic fibrosis gene. Yet we can’t use our knowledge to cure the disease. As you scale down, friction becomes insurmountable.” Nevertheless, given the rate at which seemingly insurmountable obstacles were being dismantled, he guessed it would be only 20 years before real designer babies would appear.
And yet, eventually, “the genes you’re born with are going to become moot. People will want the benefits for themselves, not just their children.” In other words, I asked, in the future, whenever some new genetic patch for better vision, or resistance to heart disease, or phenomenal ability to cook French cuisine comes on the market, people who can afford it will dope themselves with it much as they now download patches from Microsoft to dope their home PC? Basically, yes, he said; that was his guess.
So narcissism would become even more pronounced, infinitely more pronounced than it is now? Wasn’t that a disturbing prospect? He agreed that it was. But it was the necessary path for medical progress, and anyway, an open genome was better than a genome owned by a corporation.
“I feel, in an abstract way, good that the genome is out there for everybody.” And I supposed that in an abstract way I felt good about it also. But the world I live in isn’t always so abstract. I noticed myself wringing my hands.
In April 2000, the loosely organized anti-biotechnology movement known as Biodevastation came to Boston to protest genetically modified crops, corporate usurpation of genetic heritages, eugenics, nanotechnology, consumerist attitudes toward nature, and basically the notion that any good at all can come from having learned how DNA works. Two days of workshops and lectures at Northeastern University culminated with a rally on Copley Square on a beautiful spring day, and bio-Luddites of all stripes — including many distinctly hippie-looking stripes — held posters in favor of organic food and against the patenting of genes, while giant black puppets acted out a morality play that evidently had something to do with a Monsanto-Mordor connection.
There were plenty of police around, and lots of television news vans. On the TV news that night there was much sensational prattle about how “another Seattle” had been miraculously averted, but in truth the Biodevastation crowd was about as peaceful as they come. Unsurprisingly, none of the news reports I saw showed the least understanding of what the protests were all about.
Instead they showed clips of barefoot people dancing to drums in the park juxtaposed with clips of articulate spokespersons from the Biotechnology Conference lamenting the pathetic ignorance of their well-intentioned adversaries. The implication given by the corporate news shows was that Biodevastation was the equivalent of a Flat Earth Society convention, that the protesters lacked the most rudimentary understanding of the principles of scientific inquiry, to say nothing of DNA polymerase or protein chemistry.
In fact, although there was indeed a sizable proportion of people at Biodevastation who wouldn’t know a nucleotide from a nuclear submarine, I found that the workshop leaders were knowledgeable and persuasive. While some speakers were organic farmers with a mystical rather than scientific worldview — whose argument boiled down to “nature good; technology bad” — in general the discussion was much more nuanced.
The basic point being made by Biodevastation was not (as the corporate media implied) that science in itself was bad. It was that the biotechnological agenda was being set by a hubristic elite whose motives were, at the least, worthy of close scrutiny. How could ordinary people, the public, be confident that engineered genes would not escape their vectors? That’s what Biodevastators wanted to know. What were the long-term implications of a patent regime that rewarded companies for laying claim to genetic patterns that they did not create, but only “read”? And what were, in the long run, the implications of fostering a society where the worth of any person could be inferred from a quick look at her phenotype?
I had some amount of sympathy with the whole Biodevastation program, but mainly I was there to sell copies of my bioinformatic nanonovel, “Acts of the Apostles,” which I did from a bench near where Biodevastation Central had set up lunch. As you might expect, lunch was organic and vegetarian, and amid the peasant dresses, bare feet, guitars, anti-consumerism and attitude of defiance to the establishment I did experience a wee ’60s flashback, which I kinda liked. But mainly I was trying to sell books.
But as at the O’Reilly conferences, I found the people friendly and engaging; in fact, many of them were deeply thoughtful, but thoughtful in a different way from informationtologists. Successful bioinformaticians think with Boolean precision about things that take place in a second, things that are so small you need an electron microscope to see them. Successful organic farmers think intuitively, imprecisely, about things that happen over decades, that cover tens of acres. I did not agree with everything they said, but certainly I supported their resistance to corporate hegemony. So I bought a T-shirt for my wife.
My wife herself is a retrograde Luddite who doesn’t have a very high opinion of molecular biologists. “Those DNA manipulators are out of their minds,” she says. “They have no idea what they’re unleashing and they couldn’t care less. They terrify me.” My wife sometimes seems the very caricature of the anti-scientific ignoramus. When I try to tell her about the latest human gene that has been published on the Internet, she puts her fingers in her ears and hums.
But here’s another fact: My bio-Luddite wife, she who is so terrified of the DNA manipulators, who thinks they are out of their minds, is indeed the same hot chick who 25 years ago presented her research on multiple reading frames in single-stranded DNA viruses at Cold Spring Harbor Laboratory. That was in 1978, three years before our genetically imperfect daughter was born.
Yes, my wife, that woman over there sucking down the certified-organic fruit smoothie with her Unitarian pagan cohorts, is the phenomenological sequel to the woman I once saw pack a bikini for a virologists’ conference in Hawaii. But my wife the ex-scientist doesn’t go to scientific conferences anymore. She hasn’t done bench work in a molecular biology laboratory since shortly after our disabled son was born in 1983. She’s been too busy watching our children’s backs.
Prom night at the Perkins School for the Blind, in Watertown, Mass., is in many respects just like prom night anywhere else. Kids get dressed up, they listen to loud music, they dance. They think about graduation and they dream and worry about the future.
But in other respects, prom night at Perkins is unlike prom nights at most other high schools across the land. For one thing, most of the students are blind. Those who are sighted have poor vision. But virtually no Perkins students have blindness as their only handicap. Helen Keller, the school’s most famous alumna, was deaf and blind and had lost years of prime learning time before the miraculous intervention of Anne Sullivan, her nearly blind Perkins teacher. Helen Keller would fit in perfectly at Perkins today. To these kids, mere blindness is almost trivial.
It’s hard to pigeonhole Perkins students. Some of the students use wheelchairs or walkers, but others are athletes. Some have autism, many have mental retardation, but some score 800 on their college boards. Some look like Tom Cruise or Nicole Kidman, others have facial deformities. Some are gifted musicians; others are deaf. And once in a while there is a gifted deaf musician. Some of the seniors, upon graduation, will be going off to college; others will take up residence in a group home or move back in with their parents. Some listen to Eminem, others to Randy Travis, and some have tastes that are totally inscrutable.
But the one thing all Perkins students have in common is that they have lived the trauma of being different. They are mutants and they know it. And some, the more sophisticated among them, know that had prenatal genetics tests for their conditions existed 15 or 20 years ago, they wouldn’t even be here.
On prom night last spring I spoke with several Perkins students about this whole topic. I particularly remember one student named Jakob. He’s a 20-year-old junior who has several profound disabilities and who has been in and out of hospitals much of his life. Like Helen Keller and like many of his own friends, he has had several brushes with death — most of them in his first 10 years. He has had eye surgery, experimental drugs that nearly killed him, and emergency brain surgery at 6 o’clock one morning to keep his head from exploding. He has the kinds of conditions that lead people to choose abortion rather than carry to term damaged fetuses — sometimes after using a “quality of life” calculus that is beyond my comprehension.
He’s a smart but socially awkward guy, traumatized by 10 years of isolation and ostracism in mainstream schools but now perfectly at ease at Perkins, who reads fantasy books by the truckload — holding them 2 inches from his face until his eyes fatigue and turn inward.
I asked Jakob how he felt about the idea that, thanks to the Human Genome Project, in the future, kids like him and his friends need never be born. (Although Jakob’s disabilities were caused by prenatal infection and not by a genetic abnormality, that’s academic. He well knows that he’s just as much a mutant as any of his friends.) He thought for a few seconds, and then said in a voice dripping with irony, “Hello, justifiable holocaust.”
Jakob, by the way, is my son.
Later I spoke with Jane, a “program aide” at the Perkins School. She’s a 22-year-old woman with a resemblance to Gwyneth Paltrow who was nearing the end of her third year working the late shift in the Lower School. There her job was to help blind, sick, crippled and often enraged children learn “life skills.” Skills like eating their meals, dealing with their back and leg braces, changing their tampons, accepting their fate — which sometimes was that they were dying and their parents had abandoned them — and singing themselves to sleep.
Jane had been a student at Bennington College before dropping out and coming to work at Perkins. Her high school career had involved good grades in honors classes and lots of activism and leadership. Among dozens of other activities she founded the local chapter of Amnesty International, and upon graduation she had been awarded the Eleanor Roosevelt Prize of the Stone Soup Foundation. In fact, although Jane has blond hair and blue eyes, the local chapter of the NAACP gave her its most prestigious stipend in recognition of her work on behalf of oppressed people everywhere. That was but one of her seven scholarships. Funding agencies were tripping over each other to pay her college bills. They couldn’t get enough of this working-class kid who was so passionate about justice, so hardworking, so funny, so friendly and so smart.
Jane’s college career went on hold when she went insane. After experiencing the crescendo of a classic mania that astonished all who witnessed it — featuring giving away everything she owned, going without sleep for 10 days, having a long intimate conversation with God, and writing a grandiose but illegible manifesto in permanent marker on dozens of square yards of freshly painted dormitory walls — she spent the next month in the psychiatric ward of Massachusetts General Hospital. Even when filled with enough tranquilizers to subdue an elephant she stayed awake another four days, talking of miracles.
Jane has bipolar disorder, a genetic mental illness with a 20 percent mortality rate, more deadly than many cancers. Her disease had been in remission for years when I spoke to her for this story, but it is always lurking. Even before Jane was discharged from the hospital, people were telling her to forget about having children, because any of her children would be as fucked up as she was and thus clearly not worth having. So I asked her how she felt about that and about the high likelihood that genetic conditions like hers would be detectable soon by cheap genetic tests, that bipolar nut jobs like her would disappear from the earth because fetuses with her markers would be aborted as not worth having.
“Oh, it makes me a little bit angry,” she said. “Because, yes, I’m fucked up, but I’m not that much more fucked up than the average person. So it seems unfair. But really? It makes me sad. It makes me so sad to know that I’m not considered good enough to be a mother and to think that maybe they’re right, that people like me shouldn’t be born. Being a mutant is a lonely condition. It just makes me lonely and sad.”
Jane, by the way, is my daughter.
* * *
End of Part 1.
"Bet Me" by Jennifer Crusie
A contemporary romantic comedy set to Elvis Costello and lots of luxurious and sinful sugary treats. Read the whole essay.
"Welcome to Temptation" by Jennifer Crusie
Another of Crusie's romantic comedies, this one in the shadow of an ostentatiously phallic water tower. Read the whole essay.
"A Gentleman Undone" by Cecilia Grant
A Regency romance with beautifully broken people and some seriously steamy sex. Read the whole essay.
"Black Silk" by Judith Ivory
A beautifully written, exquisitely slow-building Regency; the plot is centered on a box with some very curious images, as Edward Gorey might say. Read the whole essay.
"For My Lady's Heart" by Laura Kinsale
A medieval romance, the period piece functions much like a dystopia, with the courageous lady and noble knight struggling to find happiness despite the authoritarian society. Read the whole essay.
"Sweet Disorder" by Rose Lerner
A Regency that uses the limitations on women of the time to good effect; the main character is poor and needs to sell her vote ... or rather her husband's vote. But to sell it, she needs to get a husband first ... Read the whole essay.
"Frenemy of the People" by Nora Olsen
Clarissa is sitting at an awards banquet when she suddenly realizes she likes pictures of Kimye for both Kim and Kanye and she is totally bi. So she texts to all her friends, "I am totally bi!" Drama and romance ensue ... but not quite with who she expects. I got an advanced copy of this YA lesbian romance, and I’d urge folks to reserve a copy; it’s a delight. Read the whole essay.
"The Slightest Provocation" by Pam Rosenthal
A separated couple works to reconcile against a background of political intrigue; sort of "His Gal Friday" as a spy novel set in the Regency. Read the whole essay.
"Again" by Kathleen Gilles Seidel
Set among workers on a period soap opera, it manages to be contemporary and historical both at the same time. Read the whole essay.