The dumbing-down of programming

Part Two: Returning to the source. Once knowledge disappears into code, how do we retrieve it?

Published May 13, 1998 7:00PM (EDT)

I used to pass by a large computer system with the feeling that it represented the summed-up knowledge of human beings. It reassured me to think of all those programs as a kind of library in which our understanding of the world was recorded in intricate and exquisite detail. I managed to hold onto this comforting belief even in the face of 20 years in the programming business, where I learned from the beginning what a hard time we programmers have in maintaining our own code, let alone understanding programs written and modified over years by untold numbers of other programmers. Programmers come and go; the core group that once understood the issues has written its code and moved on; new programmers have come, left their bit of understanding in the code and moved on in turn. Eventually, no one individual or group knows the full range of the problem behind the program, the solutions we chose, the ones we rejected and why.

Over time, the only representation of the original knowledge becomes the code itself, which by now is something we can run but not exactly understand. It has become a process, something we can operate but no longer rethink deeply. Even if you have the source code in front of you, there are limits to what a human reader can absorb from thousands of lines of text designed primarily to function, not to convey meaning. When knowledge passes into code, it changes state; like water turned to ice, it becomes a new thing, with new properties. We use it; but in a human sense we no longer know it.

The Year 2000 problem is an example on a vast scale of knowledge disappearing into code. And the soon-to-fail national air-traffic control system is but one stark instance of how computerized expertise can be lost. In March, the New York Times reported that IBM had told the Federal Aviation Administration that, come the millennium, the existing system would stop functioning reliably. IBM's advice was to completely replace the system because, they said, there was "no one left who understands the inner workings of the host computer."

No one left who understands. Air-traffic control systems, bookkeeping, drafting, circuit design, spelling, differential equations, assembly lines, ordering systems, network object communications, rocket launchers, atom-bomb silos, electric generators, operating systems, fuel injectors, CAT scans, air conditioners -- an exploding list of subjects, objects and processes rushing into code, which eventually will run with no one left who understands them. A world full of things like mainframe computers, which we can use or throw away, with little choice in between. A world floating atop a sea of programs we've come to rely on but no longer truly understand or control. Code and forget; code and forget: programming as a collective exercise in incremental forgetting.

Every visual programming tool, every wizard, says to the programmer: No need for you to know this. What reassures the programmer -- what lulls an otherwise intelligent, knowledge-seeking individual into giving up the desire to know -- is the suggestion that the wizard is only taking care of things that are repetitive or boring. These are only tedious and mundane tasks, says the wizard, from which I will free you for better things. Why reinvent the wheel? Why should anyone ever again write code to put up a window or a menu? Use me and you will be more productive.

Productivity has always been the justification for the prepackaging of programming knowledge. But it is worth asking about the sort of productivity gains that come from the simplifications of click-and-drag. I once worked on a project in which a software product originally written for UNIX was being redesigned and implemented on Windows NT. Most of the programming team consisted of programmers who had great facility with Windows, Microsoft Visual C++ and the Foundation Classes. In no time at all, it seemed, they had generated many screenfuls of windows and toolbars and dialogs, all with connections to networks and data sources, thousands and thousands of lines of code. But when the inevitable difficulties of debugging came, they seemed at sea. In the face of the usual weird and unexplainable outcomes, they stood a bit agog. It was left to the UNIX-trained programmers to fix things. The UNIX team members were accustomed to having to know. Their view of programming as language-as-text gave them the patience to look slowly through the code. In the end, the overall "productivity" of the system, the fact that it came into being at all, was the handiwork not of tools that sought to make programming seem easy but of engineers who had no fear of "hard."

And as prebuilt components accomplish larger and larger tasks, it is no longer only a question of putting up a window or a text box, but of an entire technical viewpoint encapsulated in a tool or component. No matter if, like Microsoft's definition of a software object, that viewpoint is haphazardly designed, verbose, buggy. The tool makes it look clean; the wizard hides bad engineering as well as complexity.

In the pretty, visual programming world, both the vendor and programmer can get lazy. The vendor doesn't have to work as hard at producing well-designed programming interfaces, or at committing itself to them. And the programmer can stop thinking about the fundamentals of the system. We programmers can lie back and inherit the vendor's assumptions. We accept the structure of the universe implicit in the tool. We become dependent on the vendor. We let knowledge about difficulty and complexity come to reside not in us, but in the program we use to write programs.

No wizard can possibly banish all the difficulties, of course. Programming is still a tinkery art. The technical environment has become very complex -- we expect bits of programs running anywhere to communicate with bits of programs running anywhere else -- and it is impossible for any one individual to have deep and detailed knowledge about every niche. So a certain degree of specialization has always been needed. A certain amount of complexity-hiding is useful and inevitable.

Yet, when we allow complexity to be hidden and handled for us, we should at least notice what we're giving up. We risk becoming users of components, handlers of black boxes that don't open or don't seem worth opening. We risk becoming like auto mechanics: people who can't really fix things, who can only swap components. It's possible to let technology absorb what we know and then re-express it in intricate mechanisms -- parts and circuit boards and software objects -- mechanisms we can use but do not understand in crucial ways. This not-knowing is fine while everything works as we expected. But when something breaks or goes wrong or needs fundamental change, what will we do but stand a bit helpless in the face of our own creations?

Linux won't recognize my CD-ROM drive. I'm using what should be the right boot kernel, it's supposed to handle CD-ROMs like mine, but no: The operating system doesn't see anything at all on /dev/hdc. I try various arcane commands to the boot loader: still nothing. Finally I'm driven back to the HOW-TO FAQs and realize I should have started there. In just a few minutes, I find a FAQ that describes my problem in thorough and knowledgeable detail. Don't let anyone ever say that Linux is an unsupported operating system. Out there is a global militia of fearless engineers posting helpful information on the Internet: Linux is the best supported operating system in the world.
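
What does an arcane command to the boot loader look like? Something like this -- the particular parameter is the one the CD-ROM HOWTO suggests for an ATAPI drive, offered here as illustration rather than as a transcript of my session:

    boot: linux hdc=cdrom

One terse phrase, telling the kernel to look for an ATAPI CD-ROM as the master device on the secondary IDE channel.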

The problem is the way the CD-ROM is wired, and as I reach for the screwdriver and take the cover off the machine, I realize that this is exactly what I came for: to take off the covers. And this, I think, is what is driving so many engineers to Linux: to get their hands on the system again.

Now that I know that the CD-ROM drive should be attached as a master device on the secondary IDE connector of my orphaned motherboard -- now that I know this machine to the metal -- it occurs to me that Linux is a reaction to Microsoft's consumerization of the computer, to its cutesying and dumbing-down and bulletproofing behind dialog boxes. That Linux represents a desire to get back to UNIX before it was Hewlett-Packard's HP-UX or Sun's Solaris or IBM's AIX -- knowledge now owned by a corporation, released in unreadable binary form, so easy to install, so hard to uninstall. That this sudden movement to freeware and open source is our desire to revisit the idea that a professional engineer can and should be able to do the one thing that is most basic to our work: examine the source code, the actual program, the real and unvarnished representation of the system. I exaggerate only a little if I say that it is a reassertion of our dignity as humans working with mere machine; a return, quite literally, to the source.

In an ideal world, I would not have to choose between the extreme polarities of dialog box and source code. My dream system interface would allow me to start hesitantly, unschooled. Then, as I used the facility that distinguishes me from the machine -- the still-mysterious capacity to learn, the ability to do something the second time in a way quite different from the first -- I could descend a level to a smarter, quicker kind of "talk." I would want the interface to scale with me, to follow me as my interest deepened or waned. Down, I would say, and it would let me get my way, however stupid or incomprehensible this seemed to it, a mere program. Up, I could say, so I could try something new or forgotten or lost just now in a moment of my being human, nonlinear, unpredictable.

Once my installation of Linux was working, I felt myself qualified, as a bona fide Linux user, to attend a meeting of the Silicon Valley Linux User's Group. Linus Torvalds, author of the Linux kernel and local godhead, was scheduled to speak. The meeting was to be in a building on the sprawling campus of Cisco Systems. I was early; I took a seat in a nearly empty room that held exactly 200 chairs. By the time Torvalds arrived half an hour later, more than twice that many people had crowded in.

Torvalds is a witty and engaging speaker, but it was not his clever jokes that held the audience; he did not cheerlead or sell or sloganize. What he did was a sort of engineering design review. Immediately he made it clear that he wanted to talk about the problem he was just then working on: a symmetric multiprocessing kernel for Linux. For an hour and a half, the audience was rapt as he outlined the trade-offs that go into writing an operating system that runs on multiple processors: better isolation between processes vs. performance; how many locks would be a good number, not so many as to degrade response, not so few as to risk one program stepping on the memory area of another; what speed of processor you should test on, since faster processors would tend to minimize lock contention; and so on through the many countervailing and contradictory demands on the operating system, all valid, no one solution addressing all.
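
To give the flavor of the trade-off in code -- this is a toy sketch in ordinary C with POSIX threads, every name invented for illustration, nothing taken from the Linux kernel itself -- here is the difference between one big lock and many small ones. A single mutex around all shared state is simple but makes every processor wait its turn; per-bucket locks let threads working on different buckets proceed in parallel, at the price of more locks to reason about:

    #include <pthread.h>
    #include <stdio.h>

    #define NTHREADS 4
    #define NBUCKETS 16
    #define NLOOPS   100000

    /* Fine-grained locking: one mutex per bucket, so threads touching
       different buckets never wait on each other. The coarse-grained
       alternative -- a single mutex around the whole counters array --
       would be simpler but would serialize every increment. */
    static pthread_mutex_t bucket_lock[NBUCKETS];
    static long counters[NBUCKETS];

    static void *worker(void *arg)
    {
        long id = (long)arg;
        for (long i = 0; i < NLOOPS; i++) {
            int b = (int)((id + i) % NBUCKETS);
            pthread_mutex_lock(&bucket_lock[b]);
            counters[b]++;                 /* the protected shared state */
            pthread_mutex_unlock(&bucket_lock[b]);
        }
        return NULL;
    }

    int main(void)
    {
        pthread_t threads[NTHREADS];
        long total = 0;

        for (int b = 0; b < NBUCKETS; b++)
            pthread_mutex_init(&bucket_lock[b], NULL);
        for (long t = 0; t < NTHREADS; t++)
            pthread_create(&threads[t], NULL, worker, (void *)t);
        for (int t = 0; t < NTHREADS; t++)
            pthread_join(threads[t], NULL);

        for (int b = 0; b < NBUCKETS; b++)
            total += counters[b];
        printf("total = %ld\n", total);    /* NTHREADS * NLOOPS = 400000 */
        return 0;
    }

Whether 16 locks is the good number -- not so many as to degrade response, not so few as to invite contention -- is exactly the sort of question Torvalds spent the evening turning over.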

An immense calm settled over the room. We were reminded that software engineering was not about right and wrong but only better and worse, solutions that solved some problems while ignoring or exacerbating others. That the machine that all the world seems to want to see as possessing some supreme power and intelligence was indeed intelligent, but only as we humans are: full of hedge and error, brilliance and backtrack and compromise. That we, each of us, could participate in this collaborative endeavor of creating the machine, to the extent we could, and to the extent we wished.

The next month, the speaker at the Silicon Valley Linux User's Group is Marc Andreessen, co-founder of Netscape. The day before, the source code for Netscape's browser had been released on the Internet, and Andreessen is here as part of the general celebration. The mood tonight is not cerebral. Andreessen is expansive, talks about the release of the source code as "a return to our roots on a personal level." Tom Paquin, manager of Mozilla, the organization created to manage the Netscape source code, is unabashed in his belief that free and open source can compete with the juggernaut Microsoft, with the giants Oracle and Sun. He almost seems to believe that Netscape's release of the source isn't an act of desperation against the onslaught of the Microsoft browser. "Technologists drive this industry," he says, whistling in the dark. "The conventional wisdom is it's all marketing, but it's not."

Outside, a bus is waiting to take the attendees up to San Francisco, where a big party is being held in a South of Market disco joint called the Sound Factory. There is a long line outside, backed up almost to the roadway of the Bay Bridge. Andreessen enters, and he is followed around by lights and cameras like a rock star. In all this celebration, for just this one night, it's almost possible to believe that technologists do indeed matter to technology, that marketing is not all, and all we have to do is get the code to the people who might understand it and we can reclaim our technical souls.

Meanwhile, Andreessen disappears into a crush of people, lights flash, a band plays loudly and engineers, mostly men, stand around holding beer bottles. Above us, projected onto a screen that is mostly ignored, is what looks like the Netscape browser source code. The red-blue-green guns on the color projector are not well focused. The code is too blurry, scrolling by too quickly, to be read.


By Ellen Ullman

Ellen Ullman is a software engineer. She is the author of "Close to the Machine: Technophilia and Its Discontents."
