Bugged out

"The Bug" author Ellen Ullman talks about the Gothic terrors that lurk between the rational lines of computer code.

Published May 16, 2003 7:00PM (EDT)

When C.P. Snow first identified the rift between "two cultures" of scientists and literary intellectuals who could neither understand nor communicate with each other, it was 1959, and computing was in its infancy. Today, in many fields, the denizens of Snow's two cultures are reaching across the gap. But computer programming remains a vast unknown country to most outsiders -- even as more and more of our work and our culture stands on its foundation.

Ellen Ullman has lived in that country, and ever since the 1997 publication of her memoir, "Close to the Machine: Technophilia and Its Discontents," she has meticulously and articulately chronicled its customs and dangers. An English major turned programmer turned writer, she has a knack for talking about the experience of writing code -- "thought that works," as one of her characters puts it -- in ways that nonprogrammers can understand. She does so without glorifying the creators of software, as scribes of the Internet boom would; but she doesn't trivialize their work or their lives, either. She does full justice to the highs and lows of the programming life -- as both an unnatural punishment for the human organism and an invigorating challenge to the human brain.

"The Bug," Ullman's new novel, tells the tale of Ethan Levin, a programmer bedeviled by an elusive, infuriating bug that resists every attempt to corner it. As the hunt for this software nemesis grows fiercer, Levin's personal life unravels.

I talked with Ullman at her writing studio in a 1912 office building, just south of Market Street in San Francisco.

"The Bug" started as nonfiction. How did it evolve into a novel?

After I finished "Close to the Machine," I thought, well, what now? The most direct route for me seemed to be to collect some essays I'd already published. To complete that, I thought I'd write a novella-length essay about a bug I'd had when I was a programmer in the mid-'80s -- a bug that drove me crazy. It came and went. I could not figure out what was causing it. No one could reproduce the set of circumstances that actually led to it.

In "The Bug," the programmer, Ethan Levin, scribbles down the evocative phrase, "Cannot reproduce."

If you can't reproduce a bug, the suspicion is it isn't there. That's one of the programmer's defenses: I can't reproduce this, so what are you talking about? But it was clear there was a bug. Out of the blue, the whole front end would just die. It was clearly my code, and I could not figure out what the cause of it was. And it really caused me despair. The rest of my work was going well enough. I was on schedule.

This was when you worked at Sybase?

Yes. I think I was the only engineer in the company on schedule, according to my boss. And so on the face of it things were not going badly. But it gnawed at me, because I knew it was lying in wait in there. There was something wrong at the heart of this system.

So I thought, that might make an interesting essay, a way for people who don't write software to understand the debugging process. To see how writing software is more or less equivalent to debugging. You have an idea of what you have to do, and you write some code, and then you have to get it past the compiler, and the compiler simply certifies that it's syntactically correct. But is the program doing what it's supposed to do? On the first pass, almost never. It's a reiterative process. You don't expect that you'll write your code, run it through the compiler, and everything will work.
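
[A minimal illustration of the gap she describes, not code from the novel: the hypothetical C fragment below gets past the compiler without complaint -- it is syntactically correct -- yet it does not do what it was supposed to do.]

    /* The compiler certifies only that this is legal C; it has no idea the
       loop runs one step too far. The off-by-one writes past the end of
       the array, and the damage may not surface until much later, somewhere
       else entirely. (A hypothetical fragment, not code from the novel.) */
    #include <stdio.h>

    #define N 10

    int main(void)
    {
        int totals[N];
        int i;

        for (i = 0; i <= N; i++)        /* bug: should be i < N */
            totals[i] = 0;

        printf("initialized %d slots\n", N);
        return 0;
    }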

People do write smaller, more object-oriented code now, and it's easier to write a small chunk of code -- but then it's harder to test how it will be when it all gets together. So there's really no escaping the debugging process.

I think the history of software engineering is trying to find a way to write reliable code and miserably failing, year after year. It wouldn't be a perennial topic in software engineering if anyone had any idea how to fix the problem of bugs.

But when I sat down to write, I really didn't get anywhere. It seemed too relentlessly techy. Then, when I tried to personalize it, I found I didn't want to write about my own life at that juncture. I didn't think it was that interesting. I'd already written a memoir. I thought, how much more of me do people really need?

So I thought, OK, I'll see if I can turn it into fiction. And that was a difficult decision, because I learned that fiction and essays really have very little in common. The impulse of an essay is to explain, somehow, even if it's only to yourself; but the impulse of fiction is to dramatize. I really had no idea how to write a novel.

Had you written fiction before?

I had. Lots of competently ordinary short stories that I'd sent away -- this was while I was programming, in the mid-'80s. I'd get nice rejections or form-letter rejections. I stopped doing that because I knew they weren't particularly good. I do have a novel in the drawer -- very bad, very long, 760-some-page manuscript. Three or four people have read it, to their detriment.

Was it about programming?

No, no, no, no, no. I thought of writing as a way to get away from programming. Programming to me was a job. It wasn't a mission. It didn't occur to me at that point, in the mid-'80s, to write about computer programming, which at that point was not a social and public phenomenon. It was what we used to call a back-room operation. It literally was back there with the general ledgers. At that point the world was not exactly waiting with bated breath, "Oh, what's going on in the back room with the computer?"

What led you to set the main part of your story back in the 1980s?

A couple of reasons. One, trying to write about new computer things, if you're going to spend years on it, like you do on a novel, it's pretty hopeless. By the time you're done with it, it won't be new anymore. It's not journalism. This book took me five years. If you tried to write it about breaking developments, it's going to look old no matter what. You can't keep updating; it's too integral a story. You can't just flip pieces in and out of it -- it's not object-oriented.

In "Close to the Machine," I had mentioned a particular version of C++ -- it was like, 4.2, and of course by the time it was out, I don't know, 6.2 was out. Someone said to me, well, you shouldn't have put the version number in there; when the book came out, it was like, oh, that old thing! It stuck in my mind, that comment: Don't bother. You're never going to stay at the leading edge of a curve with a novel.

That was one part of it. The other was, this is when it happened to me. The bug that had happened to me happened at a particular time, in a particular technical environment.

I wanted to write a historical, technical, Gothic mystery. I said to myself, I want a historical Gothic mystery in a technical setting. And that was a distinct decision -- because technology is not new anymore. We are really going on at least a third generation now of programmers, in this modern era of programming. And I wanted some kind of institutional memory, I guess. One more consideration: I thought, if I set it back then, without machines available, I wouldn't be forced to write all the code.

You wouldn't feel obligated?

It's not only obligation. When I wrote the code samples that are in "The Bug," I sat and then I thought, well, what would these connect to? And there I was, sitting at the MacDowell Colony, ostensibly writing the novel, and whole days would go by when I was just writing code. I actually had a little compiler on my laptop. Finally I thought, this is really a bad idea -- code really eats up your time. I thought, I'll never write a novel if I set it somewhere I actually could write the companion code. So I thought it would be better to set it in a time with an old version of the C language -- a whole technical environment that had come and gone.

These days so much of coding involves using packages or other layers of code that have already been written. At that time, in the mid-'80s, you really did have to write everything yourself. There were some things we slurped in, minimal stuff, the usual packages available on Unix, like termcap or terminfo. But the rest of it, you had to write it all yourself, there wasn't Motif, there wasn't Windows; if you wanted a windowing system, you had to start by figuring out, how do you interact with all these devices? And how do you abstract all the devices? None of that existed at the time, so I thought it would be good to remember that.
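
[A sketch of what that hand-rolled terminal work looked like, assuming a Unix machine and the classic termcap library she mentions; the details are illustrative, not drawn from the book.]

    /* Mid-'80s-style terminal handling with termcap: look up this
       terminal's escape sequences, then clear the screen and move the
       cursor by writing them out. Link with -ltermcap (or -lncurses on
       modern systems). Illustrative only -- not code from "The Bug". */
    #include <stdio.h>
    #include <stdlib.h>
    #include <termcap.h>

    int main(void)
    {
        char entry[2048], seqs[2048], *p = seqs;
        char *clear, *move;
        const char *term = getenv("TERM");

        if (term == NULL || tgetent(entry, term) != 1)
            return 1;                            /* unknown terminal type */

        clear = tgetstr("cl", &p);               /* clear-screen capability */
        move  = tgetstr("cm", &p);               /* cursor-motion capability */
        if (clear == NULL || move == NULL)
            return 1;

        tputs(clear, 1, putchar);
        tputs(tgoto(move, 10, 5), 1, putchar);   /* column 10, row 5 */
        printf("hello from row 5, column 10\n");
        return 0;
    }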

By the end of "The Bug," your reader comes to understand that Ethan Levin's bug, which gets nicknamed "the Jester" and seems almost supernatural in its elusiveness, is comprehensible -- it has an explanation. Was it the same with the real-life bug that inspired the story?

I don't even know whether I should tell you this. Actually, I thought I had fixed my bug. But then I sat down to write this book and I tried to remember the bug and how I'd fixed it, and I looked at it, and I thought, that could not possibly have fixed that bug!

So you think, in retrospect, that you didn't actually fix it?

The bug disappeared. It never recurred. Whatever I did, I don't know if that fixed it; it then disappeared. It might have been a memory leak, it might have been any number of things. Maybe I did fix it but in ways that are now mysterious to me. I almost changed the end of the book when I realized this -- but I thought, no, it would be way too unsatisfying to a reader to just have it disappear and never recur, without a solution. But that happens in real life.

Of course it doesn't matter: That software was in use for about 10 years, and it's been superseded -- no one out there has to worry! If they're still using that software after all this time, they have other problems. But that was an awful thing, to sit back and think, you know, I never did fix that bug.
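
[For the nonprogrammer, a hypothetical example of how a bug can "come and go": in the C sketch below, a pointer is set on only one path through the function, so whether anything visibly breaks depends on whatever happens to be lying around in memory -- exactly the sort of failure that resists reproduction.]

    /* The anatomy of a "cannot reproduce" bug, sketched hypothetically:
       msg is assigned only on the verbose path, so the quiet path prints
       whatever garbage the stack happens to contain -- sometimes nothing
       bad seems to happen, sometimes the program dies far from here. */
    #include <stdio.h>

    static void report(const char *label, int verbose)
    {
        char buf[64];
        char *msg;                       /* bug: never initialized */

        if (verbose) {
            snprintf(buf, sizeof buf, "status: %s", label);
            msg = buf;
        }
        printf("%s\n", msg);             /* undefined behavior when verbose == 0 */
    }

    int main(void)
    {
        report("ok", 1);                 /* fine */
        report("ok", 0);                 /* may work, may crash, may corrupt */
        return 0;
    }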

One of your themes is that, in the end, the mystery is something we project onto the machine -- that however complex the computer is, if you go down far enough, it is all just the ones and zeroes, the hardware and its states, it's all stuff that's created by human beings.

But the counter story is that Ethan is working on a "game of life" simulation. And this notion that complexity can produce something -- you used the word "supernatural," or seemingly mysterious, having what seems like its own will -- this is the current thinking on consciousness, on sentience, on almost everything. That you may understand the little pieces, but putting them together is where things come alive. Yes, if you go down to the very bottom, it's understandable; but the ways in which all these little bottom pieces interact is not understandable, necessarily. Then the question is, is that how we really are? Or is that a way of simulating us?

I wanted to have both points of view in there. Computers as we currently know them are not alive. But then the question is, how does something become alive, or apparently become alive? That's unsolved, and we don't know it. There are many theories. The one that scientists are currently leaning on is chaos theory, or emergence. It isn't just a machine that, if you sit patiently, you'll understand it all. At some point there may be enough complexity that you really can't understand it.
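
[Ethan's "game of life" is a cellular automaton in the mold of Conway's Game of Life. The toy C sketch below is not the novel's code, but it shows how a few mechanical rules per cell can produce gliders and oscillators that seem to move with a will of their own.]

    /* Conway's Game of Life: each cell lives or dies by simple local
       rules, yet patterns emerge that appear to crawl across the grid
       on their own. A toy sketch for illustration only. */
    #include <stdio.h>
    #include <string.h>

    #define ROWS 8
    #define COLS 8

    /* count the live neighbors of cell (r, c) */
    static int neighbors(char g[ROWS][COLS], int r, int c)
    {
        int dr, dc, n = 0;
        for (dr = -1; dr <= 1; dr++)
            for (dc = -1; dc <= 1; dc++) {
                int rr = r + dr, cc = c + dc;
                if ((dr || dc) && rr >= 0 && rr < ROWS && cc >= 0 && cc < COLS)
                    n += g[rr][cc];
            }
        return n;
    }

    /* apply the rules once: survive with 2 or 3 neighbors, be born with 3 */
    static void step(char g[ROWS][COLS])
    {
        char next[ROWS][COLS];
        int r, c;
        for (r = 0; r < ROWS; r++)
            for (c = 0; c < COLS; c++) {
                int n = neighbors(g, r, c);
                next[r][c] = g[r][c] ? (n == 2 || n == 3) : (n == 3);
            }
        memcpy(g, next, sizeof next);
    }

    int main(void)
    {
        char grid[ROWS][COLS] = {{0}};
        int r, c, gen;

        /* seed a glider */
        grid[1][2] = grid[2][3] = grid[3][1] = grid[3][2] = grid[3][3] = 1;

        for (gen = 0; gen < 4; gen++) {
            for (r = 0; r < ROWS; r++) {
                for (c = 0; c < COLS; c++)
                    putchar(grid[r][c] ? '*' : '.');
                putchar('\n');
            }
            putchar('\n');
            step(grid);
        }
        return 0;
    }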

And that, I guess, is where the "Gothic" comes in. What did the term mean to you when you said that was what you wanted to write?

Dark. Suspenseful. I really did feel like I wanted to write something that had the kind of whodunit energy of a murder story, without having a dead body or anything like that. A little overwrought. But in a way you can hopefully get away with.

A reference to "Middlemarch" plays a small but significant part in the story. Did you have any specific literary models in mind?

There is a sort of game of "catch the literary references" in there, which I did for myself. There was one scene, that I took out, that's sort of a reenactment of the opening scene of "The Magic Mountain," and there's a quote from that book at the beginning of "The Bug." Obviously, Mary Shelley's "Frankenstein." These are not literary so much as intellectual sources. There's also, if you can find it, something from "Madame Bovary." Obviously Kafka -- the bug -- and "Middlemarch."

I wish I could say there was a unity of literary influence on it. It took me so long to write -- it was narrated one way, then another, and it took me a long time to find the characters. I'm working on a second book, and thank goodness, this one is starting with characters and a situation -- I don't have an idea at all.

It's a novel?

Yes. Starting a novel with an idea is, I think, a difficult burden. It's much better to have the characters, just stick with them and they'll tell you what has to happen.

I had the feeling that both Ethan, the programmer with the bug, and Berta, the tester who first finds it and later joins the hunt, are aspects of yourself, representing different parts of your experience and psyche.

Absolutely right. I once took an acting class and what they taught you there was, don't make a character from the outside -- find something in you and then make it bigger, round it out. So each of these characters is me in a way, taking some aspect of me and hopefully rounding it out. It took me a while to understand that Ethan really was me. Originally he was much more of a cardboard cutout guy programmer. And then I had to remember that I was the one who'd had this bug, and invest him with the fears and the hopes and the insecurities that I'd had, or anyone might have, in this position. And then there are things about him that are absolutely not like me, that are picked up from other people I've known, other engineers I've worked with.

In many ways Ethan fits the image of the programmer who has a hard time dealing with people. Do you think programming attracts people who have trouble handling human interactions, or does writing software make programmers more antisocial?

It's partially a self-selecting group. People who spend all their time with a compiler and a debugger have to be emotionally suited for that role. You won't last very long if you don't find that satisfying. This is a stereotype; like all stereotypes, it has truth in it, that programmers are a group who are much better at machines than they are at people. Of course, I've known many fine software engineers who are wonderfully well-rounded people. But in general it is true, programmers I've known are people who have very formal ways of thinking, very formal ways of interacting, don't seem to read facial and bodily cues all that well. And then of course the work reinforces that. You get rewarded not for how well you play with others but how well your code works.

Now there is a new development that's changing this. I think that the open source movement is a way for coding to be a social act. It's social in the way programmers are social; there's a lot of ego involved in it, showing off your work and getting recognition for it, and I think that's fine. The most promising thing about it, in addition to having open source, is that the practitioners now have a social way to interact around programming.

The way it's been before is that everyone's deadlines are so absurd, and there's so much work to do, that you're kind of like two people sitting side by side each digging a separate tunnel, or separate well. We have this notion of a programming dummy: You go to your colleague and you describe your problem, and they just sit there, and they nod, and nod -- that's why it's called programming dummy. And in the process of explaining your problem to this other person, you figure it out and you go, "Ah, great, thank you," and run away and fix it.

There are code reviews. But for the most part it's very difficult even to talk in a detailed way with your very own colleagues about the work you're doing, it's so solitary. So open source is a way to have at least a community of programmers who can talk about the same code. And that, I think, is an amazing change in the whole way that software gets produced.

There's a moving moment in "The Bug" when Berta imagines a happier kind of working relationship with Ethan, as "two people bonded by collaboration, programming as human glue, not the private obsession it seemed to be." And of course today there's the "extreme programming" movement, part of which involves programmers working in pairs.

Right, and also intense interactions with the users. These are ways to break out of that box, that isolation. And I think that's really positive -- adding human interaction into the process of writing software.

Still, programmers are a self-selecting group, for the most part -- it's people who, you know, are not going to become politicians, for instance. It's interesting that there's so much hatred, or derision, of the political process among technical people, software engineers. Politics is so messy and full of half-truths, ambiguities, compromises. It's a mess, right?

There was a brief moment of technocratic hubris at the height of the Internet boom, when some of the people who were running software companies felt they should tell the rest of the world how to run their governments, schools, everything.

Well, god help us if government gets run like a software project, you know? Or maybe it does, already. Because in the end, a real software project is a very messy endeavor. Engineering is about trade-offs. Every software project is almost like a bill in Congress: everyone has had a hand in it, there are the marketing people, the prospective customers, the existing users, the engineers themselves, product managers -- everybody's pulling in some direction.

It's a messy process. Which many of us find objectionable, actually. When you're in the middle of it you think, it doesn't have to be this way! That's why the character Harry Minor, Ethan's boss in "The Bug," is one of my favorites. He says things like, "Well, you only get one chance to do it right the first time. Screw the VCs -- we want to write good code!"

He's the project manager you wish you had.

But of course people like him get eaten up. You have to be able to bend. You have to be able to say, OK, this is what's possible. Here's a schedule that you actually can meet. What are the essential things you have to have? Just do those first.

Another great character is Wallis, the scholar who runs the testing group. She gets fed up at one point and explodes, "They call it computer science, but it's not science! It's just a bunch of people trying to build something before they're overcome by stupid, senseless, mindless, moronic, idiotic, dumb mistakes!" This question -- how much of programming is really a science? -- seems to sit beneath a lot of your story.

Well, there is a point at which it's science, obviously. For the people who do the chip and the disk drive, there's a great deal of science. And not all of it is in hardware -- it's also in the logic in the hardware, microcode and so on. And operating systems, that's a little more toward science, though over time it just becomes a big piece of software. The closer you get to applications, the less it's science. Naturally, because that's about how this will be used in the world, and the world is a messy place.

So yes, there is science in it, most definitely -- if there weren't, we would not be seeing these tremendous advances in the capability of computers that we've been seeing. But we have not seen a corresponding advance in the ability to write software. And people have tried to turn that part of it into science for a long time. Top-down programming was supposed to solve it; object-oriented programming was supposed to solve it; various kinds of test techniques were supposed to solve it.

We've tried to find scientific ways to write code, and all of that has not proved successful. That's where the problem comes in. If computers were just hardware, it would be fine. But they're not. There are bugs in chips too, now.

Like the famous Pentium bug.

You get to a level of complexity in writing software where you just can't make it clear-cut, and you can't assure how it's going to work, or that it will work, or that it will work as expected. Everybody knows this, now. Anyone who uses a Microsoft operating system knows this. And it's not just Microsoft, though I think their code is particularly badly made.

It sure doesn't feel like science when you're doing it. It feels like being on the train and never being able to get off. One more compile, one more compile, one more compile.

It's pretty common nowadays to use the label "engineer" to describe almost anyone who writes software. But the term carries with it some promises of reliability that software doesn't always deliver.

"Engineer" used to mean people who were working kernel-level or lower -- it referred to a level of code one was writing. If you were writing in the application space you were a programmer, and if you were writing in the system space you were an engineer. And there was some merit to that distinction. The application space has now become so complicated mainly because of all the incredible energy that's gone into the user interface.

So the term "engineer" just began to leak across the divide. It reflects the desire to try to see it all as a scientific process. But it's right in one respect: You want to build something. Engineering implies a kind of tinkering, building mentality: Let's get to work, let's get it to work. It's not about designing the most elegant machine, it's not art. The goal of engineering is to build something that works. And I think that's very much the motive in programming.

It's been five years since "Close to the Machine," and we've seen a boom and a bust in that time. Has anything fundamental changed as a result?

Two big changes. There were a lot of people who watched the Internet gold rush from the beginning who said, "Gee, there's no business plan to make money here at any of these companies, and there never was, really." I think I'd like to write a business novel that begins, "All boom times are alike, but each recession is miserable in its own special fashion." There's a sense in which this could have been tulip-mania, this could have been the '20s.

I'm sorry that happened, because it has discredited a lot of the creative energy that could have been harnessed. We still don't really know how good the Web is, because people had crazy ideas that everyone will want to shop online, and somehow you should value an online grocer the way you would a chip company. That was silly. But that doesn't mean that the Web is a bad idea. And there are ways to make money on it -- just not grandiose ways.

So I regret the collapse of all that creative energy. What happened in that rush was what had happened during the '70s and '80s, when my crop of programmers came in. They weren't engineers; we weren't engineers. Everything was exploding, and the need for personnel was so great that all these smart people were coming in, and they had all kinds of backgrounds, and they were young, and they were smart and energetic.

I hate what's happened -- a whole generation of programmers now in their 40s and 50s is out of work. There's a lot of institutional memory that's getting lost because of that. I can't tell you the number of people I know and who I respect in the industry who are now looking for work. You'd be shocked. These talents are now lost to us. Not just the individuals, but their contributions to the industry.

The other side of it is the political changes. Given where technology is now, the way it's moved into surveillance is alarming. I'm someone who started out working in databases, and to see what's happened to database technology, turning into the Total Information Awareness initiative, is alarming. And not only that, the movement toward biometric data, actual data about the body, this is a truly alarming development.

People trust what the computer tells them.

I think that the Department of Justice has just put out a departmental letter saying they no longer can assure the accuracy of the National Crime Information Center database. Databases are full of error -- in software, in the maintenance, in the records that got in there. This is the nightmare of identity theft: Once things go into an electronic file, trying to disprove it is very difficult. We've given these electronic records the aura of infallibility and truth.


By Scott Rosenberg

Salon co-founder Scott Rosenberg is director of MediaBugs.org. He is the author of "Say Everything" and "Dreaming in Code" and blogs at Wordyard.com.
