Trashing the flamers

An online civil libertarian discovers the proper uses of "censorware" software filters

By Mike Godwin
Published May 15, 1998 7:00PM (EDT)

I've often said publicly that I'll never impose software filters on my daughter (now 5 years old and happily computing) -- that my job as a parent is aimed at preparing my child for life, not hiding life from her.

What I never anticipated was that the harshest critics of "censorware" would persuade me to use blocking software myself.

And, in fact, they've caused me to rethink my feelings about software filters -- to reject the notion that they are necessarily repressive, or that there is something inherently pernicious about the technology. I don't hate government censorship any less, but I have a newfound respect for those individuals who, for whatever reason, choose to screen what they see online.

First, a little history: I've been fighting for free speech on the Net for a long time -- for much of my career as a writer, and for my whole professional career as a lawyer and as counsel to the Electronic Frontier Foundation. In that role, I've worked on free-speech issues almost every day, and helped develop legal theories about freedom of speech on the Net that became part of the successful challenge to the constitutionality of the Communications Decency Amendment.

Our win in that case was based on a number of complementary constitutional theories, one of which was an argument that, reduced to its essence, goes something like this: The Communications Decency Amendment's ban on so-called indecent online expression is not the least restrictive means of achieving whatever legitimate goal the government may pursue in its efforts to protect children. The existence and ongoing development of software that can be used to filter out such speech when it comes from the Internet is an example of a less-restrictive alternative that achieves substantially the same goals.

This argument is based on a standard constitutional test of the validity of any government action aimed at restricting constitutionally protected speech (and what the government calls "indecent" speech is nonetheless protected by the First Amendment). Such content regulations have to be shown -- by the government, which bears the burden of proof -- to be the least restrictive means of achieving the government's goal.

I had made various versions of this argument in speeches and articles since 1994, and representatives of companies or organizations developing filtering schemes joined in the anti-CDA fight. But despite the fact that I saw strategic value in the existence of filters, I was never a fan of the software or of any of the proposed rating schemes for Net content. All of the products and systems I saw were deeply flawed in several respects -- and, as reporters Brock Meeks and Declan McCullagh later revealed, the stand-alone software packages often seemed to have antigay, antisex or antifeminist agendas in the lists of sites and words that were blocked.

For many critics of "censorware," this was all they needed to know: It meant these products were in some sense evil, since they incorporated some less-than-desirable opinions and blocked content on the basis of those opinions.

But I couldn't join in their condemnation, in large part because I saw a sort of paternalism in it that was just as disturbing as that of the CDA. If there was anything that I'd learned as a civil libertarian and student of the First Amendment, it was the necessity for tolerating those whose views I disagreed with. Surely this included those parents who might choose to use a content-blocking product that I'd never use -- a product whose content choices I personally might find abhorrent. After all, nobody was compelling me to listen to those parents, or to accept the agendas of the blocking lists, or to use the stuff on my little girl. In an open society, we allow people to speak their minds for the most part; we also don't require everyone else to listen to the speakers.

In a number of online forums -- mailing lists, in particular -- I began to express this view. The result? I was suddenly attacked for being a tool of the censorware vendors -- perhaps even in their pay! (It was also asserted, equally falsely, that EFF must have gotten donations based on my public "support" of filtering software.) Not that I was uniformly attacked at first -- occasionally one of the anticensorware guys would try to win me over to the One True Way. When one of the opponents of filtering software first brought to my attention how he'd discovered that one product blocked gay sites, feminist sites and even the home page for my own organization, EFF, he expected me to be horrified. But I couldn't muster up enough outrage -- while I didn't approve of these choices on the part of the programmers, I knew that for many parents this content blocking qualified as a feature, not a bug.

For me, an essential part of being a pluralist is tolerance for other people's views. Another part is tolerance for people's unwillingness to hear other people's views. Which means that the true pluralist should hesitate before trying to impose pluralism on everyone else.

Predictably, this nuanced view led me to be condemned as a sort of cyber-Girondist by the anticensorware Jacobins. To judge from the intense fervor of their stepped-up attacks on me, plus the increasingly fantastical character of those attacks, I'm certain they'd have guillotined me if they could.

In both my personal and professional online life, I'm known to be an acerbic critic of those I disagree with; I'm also known to be a thin-skinned one. But for years I'd also committed myself to engaging with those I argued with for as long as I could -- and, indeed, the sight of a slam at me in some online forum would often re-energize me to write a point-by-point response.

But the length and severity of these attacks by the Jacobins were something new to me. The inventiveness of the fabrications about me and my views, along with the obsessive need to follow me from forum to forum for the purpose of discrediting me, disheartened me as I had never been disheartened before. At first I thought I had to respond to the attacks (typically from people whose own work toward freedom of speech scarcely extended beyond their indulgence in flame wars against the heterodox). But it became increasingly clear to me that my responses -- no matter how successfully I framed them, calmly and logically on the one hand or with passionate outrage on the other -- were doing me no good. The Jacobins, clearly pleased when they struck a nerve, would redouble their efforts, "explaining" how my claims were invariably a lie or a ploy, and discounting my record as a free-speech advocate. And the rest of the participants in the forum would invariably decide that both the attacks on me and my responses to them were tedious distractions from the real issues at hand.

It was my dismay at the Sisyphean task of rebutting these gratuitous attacks that led me to resign from one cyberlaw-oriented list that I'd been part of since its beginning four or five years before. At this point, I quit participating in most online forums altogether, and concentrated instead on revising the draft of my first book and preparing to move to New York, where I was about to begin a fellowship.

But I knew I would go back someday. And when I did, I was armed with a tool that promised to make my online quality of life a little better: the "Filters" component of the e-mail software called Eudora Pro, which allowed me to choose never to see e-mail sent by any of the Jacobins. And with this tool in "hand," I began to resume participating in a number of online forums.

I discovered to my delight that, by blocking e-mail whose "From" header included the name of any of the Jacobins, I never had to see what they said about me. Or at least not directly -- while I could have blocked most messages that even quoted a Jacobin's message, I chose not to, since I didn't want to be that distant from the stream of discussion, and since other people's responses to their attacks on me were a good gauge as to whether I needed to say anything at all. I found that by sparing myself these sallies that so often appealed to the devils of my worst nature, I could participate with a greater degree of equanimity. It was good both for my blood pressure and for my public image not to read these guys, much less engage with them.
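The kind of "From"-header filtering described above can be sketched in a few lines. This is a minimal illustration, not Eudora's actual implementation; the sender addresses and mailbox names are hypothetical.

```python
# Hypothetical block list of senders whose mail goes straight to the trash.
BLOCKED_SENDERS = ["jacobin1@example.com", "jacobin2@example.com"]

def route_message(headers: dict) -> str:
    """Return the mailbox a message is filed in, based on its From header."""
    sender = headers.get("From", "").lower()
    # The filter is nothing more than a substring match against a block list.
    if any(name in sender for name in BLOCKED_SENDERS):
        return "Trash"
    return "Inbox"

print(route_message({"From": "jacobin1@example.com"}))  # Trash
print(route_message({"From": "friend@example.com"}))    # Inbox
```

Mail from anyone on the list is filed unseen; everything else lands in the inbox as usual.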

But my delight in being able to "censor" the nastiest of my critics led me to think through the consequences of my choosing to use this technology. Was I somehow promoting an anti-free-speech agenda by using this software to cleanse my online experience of much of the content that caused me pain? Or was I doing something rather different -- demonstrating an underlying validity to people's choosing to block content they don't like?

These reflections, plus some thoughts about how computing technologies normally work when they block content, led me to the following conclusions about filtering software/censorware:

1. Filtering text is not hard. Searching for patterns in incoming text streams is one of the best and simplest things computers can do.

2. All filtering of text works more or less in the same way. That is, it searches for particular strings of characters, which may be words or Web addresses or PICS-compliant rating tags. There is little difference between what Eudora does, when it filters messages, and what Surfwatch does: Both applications look for certain strings of characters within messages and decide what to do with the messages on the basis of the presence of those character strings.

3. No matter what the critics of "censorware" try to do about it, the market for content-filtering software is only going to grow. Even those who don't use Eudora's filtering to censor offensive content, for example, will use the feature for routing incoming mail, for handling some kinds of e-mail spam and so on. For many people, the world is already too flooded with information -- this is a theme of David Shenk's book "Data Smog" -- and the development of tools that help manage this "information glut" can only be a growth market.

4. All filtering of text that functions at the user level can be imposed at the level of the server. "Server-level filtering" is a bête noire of Internet free-speech advocates -- it means that the folks running the larger systems, the network nodes and Internet service providers, are making content choices long before you get to see the content. But there's no technological barrier to a server-level implementation of my Eudora filters.

Of course, I'm not the only person to have seen such similarity between server-level content filtering and filtering by the individual; the Jacobins have done so as well. Although most of the mainstream public debate has been about government censorship and not individual use, many Jacobins have decried both server-level and user-level products, and, in fact, it was criticisms of user-level products rather than server-level ones that seeded their movement. Arguing an essential identity between server-level and user-level filtering, some of them have suggested not only that government should be barred from blocking content, but also that individuals should, perhaps, be barred from this as well.

5. This means that PICS isn't "the devil" (as law professor Larry Lessig put it last year); it's simply the devil du jour. In some respects PICS -- the Platform for Internet Content Selection, a ratings-scheme protocol -- is less dangerous to free speech than a product like CyberPatrol, since the former requires multilateral, international cooperation in order to work. (For more on the subject, read my note on PICS.)
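Points 2 and 4 above can be made concrete with a short sketch. The same string-matching predicate that screens one person's mailbox can be applied, unchanged, to every user's traffic at the server; the block strings, messages and user names here are all illustrative.

```python
# A hypothetical list of character strings to block -- words or addresses alike.
BLOCK_STRINGS = ["flame", "badsite.example.com"]

def blocked(text: str) -> bool:
    # The core of every text filter: search for particular strings of characters.
    return any(s in text.lower() for s in BLOCK_STRINGS)

# User-level filtering: one reader screens her own inbox.
my_inbox = ["hello there", "yet another flame war"]
kept_for_me = [m for m in my_inbox if not blocked(m)]

# Server-level filtering: the identical predicate applied to everyone's
# traffic, before any individual user gets a choice in the matter.
all_traffic = {"alice": ["news digest"], "bob": ["visit badsite.example.com"]}
kept_for_all = {user: [m for m in msgs if not blocked(m)]
                for user, msgs in all_traffic.items()}

print(kept_for_me)   # ['hello there']
print(kept_for_all)  # {'alice': ['news digest'], 'bob': []}
```

The technology is indifferent to where it runs; only the question of who controls the block list changes.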

When I have argued publicly that there is little fundamental difference between what Eudora's filters do and what, say, Surfwatch does, my interlocutors have objected that I'm comparing apples and, uh, Apples. "Censorware," they say, "is about censoring, whereas Eudora is designed to improve your handling of e-mail." One might just as easily say, however, that Eudora is designed to censor the bad experiences associated with receiving e-mail, and Surfwatch is designed to improve the experience of the Web for users' children. It's just a question of what rhetoric you choose.

But isn't it time to put aside the rhetoric -- both kinds of it! -- altogether? One of the most unhelpful principles articulated in the Technorealists' recent nonmanifesto is that "Technology is not neutral." While this may be true of a few technologies (the thumbscrew and the rack come to mind), the use of most tech is grounded less in the technology itself than in the minds, and morals, of its users.

It is the rare technology that compels its own use for evil purposes. Even botulin toxin, we now know, has its commercial cosmetic uses, and, even more remarkably, no country has used a nuclear weapon in anger even once since the year that particular technology was invented. And it is equally uncommon to find a "benign" technology that cannot be used oppressively. (Imagine what the KGB might have done if it had been able to keep its information on dissidents current with Filemaker Pro.) As I wrote in a brief piece in Wired last year, when I first began reflecting on this issue, what makes more sense than this constant, divisive bickering over software is to hate the censors, not the technologies. Anything else is software animism: In a sort of tribalistic way, it makes enemies of those who don't share the Jacobins' beliefs that this or that software is "the devil."

It's important to stress here that nothing about my experiences with the Jacobins has made me oppose government censorship any less, or oppose server-level filtering or the mandatory use of filters any less. And I still believe that the agendas and assumptions inherent in many content-filtering products and schemes are quite pernicious.

But the great irony of all these reflections about filtering software is that they forced me to discover for myself the value of censoring disturbing content. In short, the anticensorware Jacobins taught me to appreciate content filters.

That isn't all I've learned. While I'm still in the habit of acerbic postings, I've increasingly found myself rewriting harsh messages before I post them, or even apologizing, publicly and privately, when I've found that my critical comments have gone over the top. I suppose I feel a little more sympathy now for many of the onetime objects of my online scorn, and so I try more now to rein myself in a little. Even though I remain a devout civil libertarian, this small improvement in my posting style is a kind of censorship I can love.


Mike Godwin is staff counsel for the Electronic Frontier Foundation and a fellow at the Media Studies Center. His book, "Cyber Rights: Defending Free Speech in a Digital Age," will be published by Times Books this summer.
