Why unmoderated online forums always degenerate into fascism

8chan was not unique: selection biases and online psychology mean all unmoderated forums will devolve

By Keith A. Spencer

Senior Editor

Published August 5, 2019 7:39PM (EDT)


8chan, an image and message board modeled after 4chan but committed to even less moderation, is in the news again after the revelation that the El Paso shooter used the forum to post his far-right manifesto moments before his killing spree. If confirmed, that would mark the third time a right-wing mass shooter has posted his plans and/or manifesto on the site.

8chan has faced massive criticism in the past 48 hours, and its original founder has even called for it to be shut down. Yet such public hand-wringing, and the fixation on 8chan alone, obscures the true problem. It is not merely 8chan's peculiar situation that makes it a haven for the far right and white nationalism. Any online forum with the same combination of anonymity and lack of moderation will experience the same degeneration. Talking about 8chan as though it were somehow unique as a right-wing, white nationalist haven is disingenuous; the problem is far vaster, and far more intrinsic to the way the internet functions.

“Online disinhibition” is the term psychologists use to explain why anonymous online masses behave in a manner that is crueler and more hostile than they would in face-to-face interactions. As I’ve written before, the lack of accountability, the relative ease of anonymity, and the indirect nature of online communication combine to create an oft-toxic atmosphere online. Thus, on a platform like Twitter or Reddit, one person attempting to rebut another person's point is far more likely to respond with hostility, swearing at or dehumanizing them, than they would be in person.

In essence, it is hard to remember an ideological opponent’s humanity online, because you cannot see or hear them; they are merely words on a screen. Those with larger online followings, those who post about politics, and those who are women or belong to another minority group suffer more hatred and attacks from strangers. This, in part, explains why irony has become such a common online personality trait, particularly among Extremely Online millennials: adopting a posture of ironic distance is a way of shielding oneself from these kinds of attacks.

Yet the major online forums — sites like Reddit, Facebook, Twitter and YouTube — all include moderation tools to some degree. Intensely cruel, sexist or racist comments on any of those platforms are often filtered out so most users never see them.

There are, however, some sites whose creators eschew moderation in favor of an online free-for-all; this category includes the -chan forums, like 4chan and, yes, 8chan.

The creators of these kinds of sites often fancy themselves free speech activists, as though being against moderation and being in favor of free speech were the same thing (they're not). If these forums are marketplaces of ideas, it is curious that they have become dens of far-right hate, trafficking in fringe political positions that are not at all representative of how most online denizens think and feel. How did the -chan sites devolve to this state?

To put it bluntly, the -chan forums have experienced what statisticians call a “selection effect.” In other words, the people who spend time on, say, 8chan, and who express themselves on that site, are not a random sample of the population; a bias creeps in. Angela Nagle, perhaps the foremost chronicler of the history of the online alt-right, has pointed out that these sites have a long history of users trafficking in transgressive, edgy, sometimes ironic and sometimes earnest far-right and white supremacist language and memes.

Of course, these -chan sites are open to anyone; you or I could navigate to 4chan right now and post whatever we want, without having to create an account, give up an email address or post a profile pic. Yet over time, those who have a distaste for edgy or white supremacist or Nazi posts will vacate the premises, leaving only those who can stomach it — who, again, are a mix of earnest fascists and nihilistic trolls.

Thus, in time, those who earnestly hold white supremacist or fascist views will feel at home there and keep coming back, and they will come to outweigh users of other political stripes, and even the nihilist posters and lurkers who view such content with ironic detachment.
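To make that dynamic concrete, here is a minimal toy sketch in Python. Everything in it is an assumption chosen for illustration: the three stylized user types, their starting proportions, and the per-month probability that each type quits after exposure to extreme content. It is not a model of any real forum's data.

```python
# Toy simulation of the selection effect described above (illustrative only;
# all numbers are made-up assumptions, not data about any real forum).
import random

random.seed(42)

# Three stylized user types: earnest extremists, ironic/nihilist trolls,
# and everyone else ("other"). The initial mix is an arbitrary assumption.
population = (["extremist"] * 5) + (["troll"] * 15) + (["other"] * 80)

# Assumed probability that a user of each type quits the forum in a given
# month after being exposed to extreme content. No new users ever join.
quit_probability = {"extremist": 0.00, "troll": 0.02, "other": 0.10}

for month in range(1, 25):
    # Each month, users who can't stomach the content drift away.
    population = [u for u in population if random.random() > quit_probability[u]]
    if month % 6 == 0:
        total = len(population)
        share = {t: population.count(t) / total for t in quit_probability}
        print(f"month {month:2d}: n={total:3d} "
              f"extremist={share['extremist']:.0%} "
              f"troll={share['troll']:.0%} other={share['other']:.0%}")
```

With these made-up numbers, the share of "everyone else" falls from 80 percent to roughly a third within two simulated years, even though nobody new ever joins and nobody changes their views; the forum's composition shifts purely through attrition of those least willing to stomach the content.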

Hence message boards like 8chan can and will devolve into havens for far-right hate: some users post such shocking content jokingly and some in earnest, until the racist and hateful voices drown out all the others.

Notably, 8chan regarded itself as a true protector of so-called “free speech,” more so than 4chan, whose moderators had deleted some of the more libelous GamerGate-related posts.

I don’t relish having to report this or diagnose how these things work. But the pattern is predictable enough that it ought to be treated as something like a law of online psychology, so that we can stop misunderstanding what “free speech” is, and stop letting the right claim that term as though they actually understand what it means.

Moreover, that unmoderated and unregulated online spaces will devolve in this manner should give libertarians pause. Anyone who thinks a lack of regulation will result only in good things needs to re-assess their ideology.

In any case, be very wary of any unmoderated online site that bills itself as a haven or a protector of “free speech.” Because of the selection effect and the reasons listed above, such forums will almost always devolve into fascist or white nationalist garbage fires, even as their administrators cry foul at those who seek to ban or moderate their excesses. This is just how it works: The combination of online psychology (the disinhibition effect), selection bias, and the weeding out of those who can’t stomach it will inevitably result in just another 8chan. 8chan may or may not be gone, but it is likely that another e-nihilist will naively start another unmoderated online forum, claim the mantle of free speech, and shrug his shoulders at all the far-right content right up until the minute that a mass shooter uploads their plans.



Keith A. Spencer is a social critic and author. Previously a senior editor at Salon, he writes about capitalism, science, labor and culture, and published a book on how Silicon Valley is destroying the world. Keep up with his writing on Twitter, Facebook, or Substack.
