The dumbing-down of programming

Rebelling against Microsoft and its wizards, an engineer rediscovers the joys of difficult computing. First of two parts.


Last month I committed an act of technical rebellion: I bought one operating system instead of another. On the surface, this may not seem like much, since an operating system is something that can seem inevitable. It’s there when you get your machine, some software from Microsoft, an ur-condition that can be upgraded but not undone. Yet the world is filled with operating systems, it turns out. And since I’ve always felt that a computer system is a significant statement about our relationship to the world — how we organize our understanding of it, how we want to interact with what we know, how we wish to project the whole notion of intelligence — I suddenly did not feel like giving in to the inevitable.

My intention had been to buy an upgrade to Windows NT Server, which was a completely sensible thing for me to be doing. A nice, clean, up-to-date system for an extra machine was the idea, somewhere to install my clients’ software; a reasonable, professional choice in a world where Microsoft platforms are everywhere. But somehow I left the store carrying a boxed copy of Slackware Linux. Linux: home-brewed, hobbyist, group-hacked. A UNIX-like operating system created in 1991 by Linus Torvalds, then passed around from hand to hand like so much anti-Soviet samizdat. Noncommercial, sold on the cheap mainly for the cost of the documentation, impracticable except perhaps for the thrill of actually looking at the source code, and utterly useless to my life as a software engineering consultant.

But buying Linux was no mistake. For the mere act of installing the system — stripping down the machine to its components, then rebuilding its capabilities one by one — led me to think about what has happened to the profession of programming, and to consider how the notion of technical expertise has changed. I began to wonder about the wages, both personal and social, of spending so much time with a machine that has slowly absorbed into itself as many complications as possible, so as to present us with a façade that says everything can and should be “easy.”

I began by ridding my system of Microsoft. I came of technical age with UNIX, where I learned with power-greedy pleasure that you could kill a system right out from under yourself with a single command. It’s almost the first thing anyone teaches you: Run as the root user from the root directory, type in rm -rf *, and, at the stroke of the ENTER key, gone are all the files and directories. Recursively, each directory deleting itself once its files have been deleted, right down to the very directory from which you entered the command: the snake swallowing its tail. Just the knowledge that one might do such great destruction is heady. It is the technical equivalent of suicide, yet UNIX lets you do it anyhow. UNIX always presumes you know what you’re doing. You’re the human being, after all, and it is a mere operating system. Maybe you want to kill off your system.
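That recursive, tail-swallowing deletion can be sketched harmlessly. The following is a minimal Python illustration of my own, not anything from the article: a toy rm_rf that runs on a throwaway temporary directory rather than the root filesystem.

```python
# A safe sketch of what `rm -rf *` does: depth-first deletion, each
# directory removed once everything inside it is gone -- the snake
# swallowing its tail. Run on a scratch directory, never on /.
import os
import tempfile

def rm_rf(path):
    # Delete the files first, then the emptied directories, bottom up.
    for entry in os.scandir(path):
        if entry.is_dir(follow_symlinks=False):
            rm_rf(entry.path)
        else:
            os.unlink(entry.path)
    os.rmdir(path)

scratch = tempfile.mkdtemp()
os.makedirs(os.path.join(scratch, "a", "b"))
open(os.path.join(scratch, "a", "b", "file.txt"), "w").close()
rm_rf(scratch)
print(os.path.exists(scratch))  # False -- the whole tree is gone
```

The real command is a single line at a shell prompt; the sketch only makes the recursion visible.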

But Microsoft was determined to protect me from myself. Consumer-oriented, idiot-proofed, covered by its pretty skin of icons and dialog boxes, Windows refused to let me harm it. I had long ago lost my original start-up disk, the system was too fritzed to make a new one, and now it turned away my every subterfuge: a DOS installation diskette, boot disks from other machines, later versions of utilities. Can’t reformat active drive. Wrong version detected. Setup designed for systems without an operating system; operating system detected; upgrade version required. A cascade of error messages, warnings, beeps; a sort of sound and light show — the Wizard of Oz lighting spectacular fireworks to keep me from flinging back the curtain to see the short fat bald man.

For Microsoft’s self-protective skin is really only a show, a lure to the determined engineer, a challenge to see if you’re clever enough to rip the covers off. The more it resisted me, the more I knew I would enjoy the pleasure of deleting it.

Two hours later, I was stripping down the system. Layer by layer it fell away. Off came Windows NT 3.51; off came a wayward co-installation of Windows 95 where it overlaid DOS. I said goodbye to video and sound; goodbye wallpaper; goodbye fonts and colors and styles; goodbye windows and icons and menus and buttons and dialogs. All the lovely graphical skins turned to so much bitwise detritus. It had the feel of Keir Dullea turning off the keys to HAL’s memory core in the film “2001,” each keyturn removing a “higher” function, HAL’s voice all the while descending into mawkish, babyish pleading. Except that I had the sense that I was performing an exactly opposite process: I was making my system not dumber but smarter. For now everything on the system would be something put there by me, and in the end the system itself would be cleaner, clearer, more knowable — everything I associate with the idea of “intelligent.”

What I had now was a bare machine, just the hardware and its built-in logic. No more Microsoft muddle of operating systems. It was like hosing down your car after washing it: the same feeling of virtuous exertion, the pleasure of the sparkling clean machine you’ve just rubbed all over. Yours. Known down to the crevices. Then, just to see what would happen, I turned on the computer. It powered up as usual, gave two long beeps, then put up a message in large letters on the screen:

NO ROM BASIC
SYSTEM HALTED

What? Had I somehow killed off my read-only memory? It doesn’t matter that you tell yourself you’re an engineer and game for whatever happens. There is still a moment of panic when things seem to go horribly wrong. I stared at the message for a while, then calmed down: It had to be related to not having an operating system. What else did I think could happen but something weird?

But what something weird was this exactly? I searched the Net, found hundreds of HOW-TO FAQs about installing Linux, thousands about uninstalling operating systems — endless pages of obscure factoids, strange procedures, good and bad advice. I followed trails of links that led to interesting bits of information, currently useless to me. Long trails that ended in dead ends, missing pages, junk. Then, sometime about 1 in the morning, in a FAQ about Enhanced IDE, was the answer:


This should get a prize for the PC compatible’s most obscure error message. It usually means you haven’t made the primary partition bootable …

The earliest true-blue PCs had a BASIC interpreter built in, just like many other home computers in those days. Even today, the Master Boot Record (MBR) code on your hard disk jumps to the BASIC ROM if it doesn’t find any active partitions. Needless to say, there’s no such thing as a BASIC ROM in today’s compatibles….

I had not seen a PC with built-in BASIC in some 16 years, yet here it still was, vestigial trace of the interpreter, something still remembering a time when the machine could be used to interpret and execute my entries as lines in a BASIC program. The least and smallest thing the machine could do in the absence of all else, its one last imperative: No operating system! Look for BASIC! It was like happening upon some primitive survival response, a low-level bit of hard wiring, like the mysterious built-in knowledge that lets a blind little mouseling, newborn and helpless, find its way to the teat.
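The decision the FAQ describes — scan the partition table for an active entry, and fall back to the BASIC ROM if none is found — can be sketched against a synthetic 512-byte MBR. The offsets below are the classic PC layout (partition table at 0x1BE, 16 bytes per entry, boot flag 0x80, signature 0x55AA at offset 510); the function name and return strings are my own illustration.

```python
# Sketch of the PC boot code's last-resort logic: look through the
# MBR's four partition entries for one marked active (status 0x80).
# Classic layout: table at offset 0x1BE, boot signature at 510-511.
def boot_verdict(mbr: bytes) -> str:
    if mbr[510:512] != b"\x55\xaa":
        return "not a valid MBR"
    for i in range(4):
        entry = mbr[0x1BE + 16 * i : 0x1BE + 16 * (i + 1)]
        if entry[0] == 0x80:  # active/bootable flag
            return f"boot partition {i}"
    # No active partition: the original IBM PC jumped to BASIC in ROM.
    return "NO ROM BASIC - SYSTEM HALTED"

# A blank disk: valid signature, but no partition marked active.
blank = bytearray(512)
blank[510:512] = b"\x55\xaa"
print(boot_verdict(bytes(blank)))  # NO ROM BASIC - SYSTEM HALTED
```

Mark the first entry's status byte 0x80 and the verdict changes to booting partition 0 — exactly the "make the primary partition bootable" fix the FAQ prescribes.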

This discovery of the trace of BASIC was somehow thrilling — an ancient pot shard found by mistake in the rubble of an excavation. Now I returned to the FAQs, lost myself in digging, passed another hour in a delirium of trivia. Hex loading addresses for devices. Mysteries of the BIOS old and new. Motherboards certified by the company that had written my BIOS and motherboards that were not. I learned that my motherboard was an orphan. It was made by a Taiwanese company no longer in business; its BIOS had been left to languish, supported by no one. And one moment after midnight on Dec. 31, 1999, it would reset my system clock to … 1980? What? Why 1980 and not zero? Then I remembered: Jan. 1, 1980, is the PC’s epoch, the date from which DOS starts counting. 1980 was Year One in desktop time.
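The 1980 epoch is not folklore; it is baked into the on-disk format. A DOS/FAT date is 16 bits: seven bits of year-since-1980, four of month, five of day. A minimal decoder of my own devising makes the epoch visible — a zeroed year field lands squarely on 1980.

```python
# A DOS/FAT date packs year-since-1980 (7 bits), month (4 bits),
# and day (5 bits) into one 16-bit word. Year zero is 1980 --
# Year One of desktop time.
def decode_fat_date(word: int):
    year = 1980 + (word >> 9)
    month = (word >> 5) & 0x0F
    day = word & 0x1F
    return year, month, day

print(decode_fat_date((0 << 9) | (1 << 5) | 1))   # (1980, 1, 1)
print(decode_fat_date((18 << 9) | (1 << 5) | 1))  # (1998, 1, 1)
```

Seven bits of year also means the format runs out in 2107 — a deadline written into the same layer of sediment.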

The computer was suddenly revealed as palimpsest. The machine that is everywhere hailed as the very incarnation of the new had revealed itself to be not so new after all, but a series of skins, layer on layer, winding around the messy, evolving idea of the computing machine. Under Windows was DOS; under DOS, BASIC; and under them both the date of its origins recorded like a birth memory. Here was the very opposite of the authoritative, all-knowing system with its pretty screenful of icons. Here was the antidote to Microsoft’s many protections. The mere impulse toward Linux had led me into an act of desktop archaeology. And down under all those piles of stuff, the secret was written: We build our computers the way we build our cities — over time, without a plan, on top of ruins.

My Computer. This is the face offered to the world by the other machines in the office. My Computer. I’ve always hated this icon — its insulting, infantilizing tone. Even if you change the name, the damage is done: It’s how you’ve been encouraged to think of the system. My Computer. My Documents. Baby names. My world, mine, mine, mine. Network Neighborhood, just like Mister Rogers’.

On one side of me was the Linux machine, which I’d managed to get booted from a floppy. It sat there at a login prompt, plain characters on a black-and-white screen. On the other side was a Windows NT system, colored little icons on a soothing green background, a screenful of programming tools: Microsoft Visual C++, Symantec Visual Cafe, Symantec Visual Page, Totally Hip WebPaint, Sybase PowerBuilder, Microsoft Access, Microsoft Visual Basic — tools for everything from ad hoc Web-page design to corporate development to system engineering. NT is my development platform, the place where I’m supposed to write serious code. But sitting between my two machines — baby-faced NT and no-nonsense Linux — I couldn’t help thinking about all the layers I had just peeled off the Linux box, and I began to wonder what the user-friendly NT system was protecting me from.

Developers get the benefit of visual layout without the hassle of having to remember HTML code.

– Reviewers’ guide to Microsoft J++

Templates, Wizards and JavaBeans Libraries Make Development Fast
– Box for Symantec’s Visual Cafe for Java

Simplify application and applet development with numerous wizards
– Ad for Borland’s JBuilder in the Programmer’s Paradise catalog

Thanks to IntelliSense, the Table Wizard designs the structure of your business and personal databases for you.
– Box for Microsoft Access

Developers will benefit by being able to create DHTML components without having to manually code, or even learn, the markup language.
– Review of J++ 6.0 in PC Week, March 16, 1998.

Has custom controls for all the major Internet protocols (Windows Sockets, FTP, Telnet, Firewall, Socks 5.0, SMTP, POP, MIME, NNTP, Rcommands, HTTP, etc.). And you know what? You really don’t need to understand any of them to include the functionality they offer in your program.
– Ad for Visual Internet Toolkit from the Distinct Corp. in the Components Paradise catalog

My programming tools were full of wizards. Little dialog boxes waiting for me to click “Next” and “Next” and “Finish.” Click and drag and shazzam! — thousands of lines of working code. No need to get into the “hassle” of remembering the language. No need to even learn it. It is a powerful siren-song lure: You can make your program do all these wonderful and complicated things, and you don’t really need to understand.

In six clicks of a wizard, the Microsoft C++ AppWizard steps me through the creation of an application skeleton. The application will have a multidocument interface, database support from SQL Server, OLE compound document support as both server and container, docking toolbars, a status line, printer and print-preview dialogs, 3-D controls, messaging API and Windows sockets support; and, when my clicks are complete, it will immediately compile, build and execute. Up pops a parent and child window, already furnished with window controls, default menus, icons and dialogs for printing, finding, cutting and pasting, saving and so forth. The process takes three minutes.

Of course, I could look at the code that the Wizard has generated. Of course, I could read carefully through the 36 generated C++ class definitions. Ideally, I would not only read the code but also understand all the calls on the operating system and all the references to the library of standard Windows objects called the Microsoft Foundation Classes. Most of all, I would study them until I knew in great detail the complexities of servers and containers, OLE objects, interaction with relational databases, connections to a remote data source and the intricacies of messaging — all the functionality AppWizard has just slurped into my program, none of it trivial.

But everything in the environment urges me not to. What the tool encourages me to do now is find the TODO comments in the generated code, then do a little filling in — constructors and initializations. Then I am to start clicking and dragging controls onto the generated windows — all the prefabricated text boxes and list boxes and combo boxes and whatnot. Then I will write a little code that hangs off each control.

In this programming world, the writing of my code has moved away from being the central task to become a set of appendages to the entire Microsoft system structure. I’m a scrivener here, a filler-in of forms, a setter of properties. Why study all that other stuff, since it already works anyway? Since my deadline is pressing. Since the marketplace is not interested in programs that do not work well in the entire Microsoft structure, which AppWizard has so conveniently prebuilt for me.

This not-knowing is a seduction. I feel myself drifting up, away from the core of what I’ve known programming to be: text that talks to the system and its other software, talk that depends on knowing the system as deeply as possible. These icons and wizards, these prebuilt components that look like little pictures, are obscuring the view that what lies under all these cascading windows is only text talking to machine, and underneath it all is something still looking for a BASIC interpreter. But the view the wizards offer is pleasant and easy. The temptation never to know what underlies that ease is overwhelming. It is like the relaxing passivity of television, the calming blankness when a theater goes dark: It is the sweet allure of using.

My programming tools have become like My Computer. The same impulse that went into the Windows 95 user interface — the desire to encapsulate complexity behind a simplified set of visual representations, the desire to make me resist opening that capsule — is now in the tools I use to write programs for the system. What started out as the annoying, cloying face of a consumer-oriented system for a naive user has somehow found its way into C++. Dumbing-down is trickling down. Not content with infantilizing the end user, the purveyors of point-and-click seem determined to infantilize the programmer as well.

But what if you’re an experienced engineer? What if you’ve already learned the technology contained in the tool, and you’re ready to stop worrying about it? Maybe letting the wizard do the work isn’t a loss of knowledge but simply a form of storage: the tool as convenient information repository.

(To be continued.)

Ellen Ullman is a software engineer. She is the author of "Close to the Machine: Technophilia and its Discontents."
