Revenge of the nerd-kings: Why some in Silicon Valley are advocating for monarchy

The tech industry's glorification of its leaders has convinced some they should rule over us like kings

By Keith A. Spencer

Senior Editor

Published April 13, 2019 2:00PM (EDT)

(Getty/Salon)

Shortly after World War II, when Europe lay in ruins and humanity was newly traumatized by the spectacle of organized violence that an authoritarian regime could achieve in the industrial epoch, the Western world experienced a sudden cultural shift. This new regime of thought is sometimes called postmodernism, but that term is obscure and overused; a better way to think about it is that there was no longer a unifying narrative, a guiding thread that united people in the West. Whereas some countries might previously have had religious bonds, or ethnic bonds, or monarchical bonds, or even political bonds around, say, an authoritarian leader, suddenly there were none — or at least none that were universally believed. Individualism and identity became more important, and politicians and legal bodies would now have to consider how to govern subjects in an ambiguous, pluralistic, multicultural world.

At the same time, it was becoming clear that the forces that shaped the world — the power to organize society, or to exterminate it — were in the hands of scientists and technologists. The atom bomb, the intercontinental ballistic missile, the radio, the car, electrification, the refrigerator and the moon landing all happened in a span of about a hundred years. Science and technology spurred World War II, and led to its conclusion. And as the war receded from memory, it was apparent that the areas of greatest economic growth were all in technical fields — computers, engineering, communications, biotech and materials science.

Jean-François Lyotard, a French philosopher who studied the condition of knowledge in this new era, realized that technology had changed the very way humans thought about what knowledge was. Knowledge that computers could not process or manage — for instance, the ability to think critically or to analyze qualitatively — was increasingly devalued, while the kinds of knowledge that computers could process became more important. As Lyotard wrote:

The miniaturisation and commercialisation of machines is already changing the way in which learning is acquired, classified, made available, and exploited. ... The nature of knowledge cannot survive unchanged within this context of general transformation. It can fit into the new channels, and become operational, only if learning is translated into quantities of information. We can predict that anything in the constituted body of knowledge that is not translatable in this way will be abandoned... Along with the hegemony of computers comes a certain logic, and therefore a certain set of prescriptions determining which statements are accepted as “knowledge” statements.

Lyotard wrote this in 1978, before the modern internet even existed. Today, the idea that computational forms of knowledge — and the kinds of people who traffic in that knowledge — are more valuable to our society seems universal. Thanks to generous grants from the tech industry and well-heeled nonprofits like the Mellon Foundation, humanities academics across the world have been spurred to do more research in what is called the "digital humanities" — a vague term that often means applying statistical and quantitative tools to data sets drawn from humanities research, such as literary corpora. The tech industry's investments in the digital humanities fulfill Lyotard's prophecy that society would cease to see the humanities' brand of knowledge as useful; that it would attempt to remake the humanities into a discipline characterized by discrete information, rather than a means of analyzing, considering, and philosophizing about the world.

In the same essay, Lyotard distinguishes between two different types of knowledge: the "positivist" kind, which is applicable to technology, and the "hermeneutic" kind. Hermeneutics, meaning the study of interpretation, is what the humanities (and to some extent the social sciences) concern themselves with. One can see how this kind of knowledge might be difficult for computers to catalogue and use. The idea that a computer could produce a literary analysis of a Vonnegut short story sounds absurd because it is: this is not the way computers process data, it is not what humans generally regard computers as useful for, and it is certainly not what tech companies design them to do. Unsurprisingly, then, this type of humanities knowledge has become devalued, and is not even considered "knowledge" by many.

So this leads us to a predicament in which, slowly, since the postwar era, humanities skills and their associated knowledge have been devalued, while STEM knowledge — "STEM" being an acronym for science, technology, engineering and math, meaning the kind of quantitative knowledge associated with technology — reigns supreme. One of the most interesting places you can see this trend is in fiction: the heroes and protagonists people admire and look up to are increasingly those with STEM knowledge, because we uncritically accept that STEM knowledge is what changes the world. There is a reason that Iron Man is a billionaire technologist, that Batman is a billionaire technologist, that the Hulk's alter ego Bruce Banner has multiple PhDs in the Marvel canon, and that the mad scientist Rick Sanchez (of "Rick and Morty") is essentially an immortal, infinitely powerful being because of his ability to understand science and wield technology. We admire these characters because they possess the kinds of skills our society deems most valuable, and we're told that we, like them, can use these skills to master the universe.

(There is a potent irony here, of course, in that it is artists who write these narratives, and artists who are partly responsible for creating and popularizing this kind of STEM-supremacist propaganda. Weirdly, though, you rarely see a superhero or a super-spy who started life as a painter, or a novelist, or a comic book artist.)

Moreover, in real life, people who possess technological knowledge, primarily the scions of Silicon Valley, are widely adulated, viewed as heroes who will inherently change society for the better. This manifests itself in various ways: some technologists, like Bill Gates and Mark Zuckerberg, have set up philanthropic foundations to "solve" our social problems — though curiously, the means by which that happens always seem to enrich themselves and their fellow capitalists along the way. Some of them promise widespread social change for the better via their own businesses, as though running a for-profit tech company were in and of itself a gift to the world and a net positive for social cohesion: you see this in the many tech companies that advertise themselves as operating "for good," such as in the PR rhetoric of Facebook. Then there are those who believe that their contribution to society will be helping us leave this planet, and who are investing heavily in private spaceflight companies with the ultimate intention of colonizing space; this includes both Elon Musk and Jeff Bezos.

In all these cases, the idea that people with STEM knowledge are predestined to save the world has become so dominant that we don’t even question it. Some call this attitude STEM chauvinism, though I prefer the moniker STEM Supremacy. The noun "supremacy," I believe, is called for because the idea that STEM knowledge (and those who possess it) is superior to other forms of knowledge has become so hegemonic that our culture openly mocks those who possess other forms of knowledge — particularly the hermeneutic, humanities-type knowledge. There is a fount of memes about humanities majors and how useless their fields are; some of these memes depict humanities majors as graduating to work low-wage jobs at places like McDonald's; others mock critical humanities majors (particularly gender studies majors) as out-of-touch social failures.

Such discourse intersects with other supremacist beliefs, such as patriarchy, and often the memes that celebrate STEM knowledge and mock humanities knowledge will simultaneously mock women and celebrate masculinity. It was unsurprising to me when, in 2017, it leaked that a Google engineer, James Damore, had circulated an anti-diversity manifesto in which he used discredited science to argue that there were biological reasons for the gender gap. He went on to argue that there were reasons men were more interested in computers and in leadership, and women less so. Though Damore was fired, he maintains that many of his peers agreed with him. Such incidents speak to the ways that different chauvinist tendencies, one of STEM Supremacy and one of patriarchy, can intersect to form novel, noxious political ideologies.

The concept of "STEM Supremacy" relies on a popular belief that STEM knowledge is synonymous with progress. Yet if you take this kind of belief a bit too far, you might be keen to abandon democratic ideals and start to believe that we really should live in a society in which the STEM nerds rule over us. This has resulted in a number of half-baked supremacists within the tech industry who advocate either for authoritarian technocracies or, more bizarrely, monarchy.

I’ll give a few brief examples. There’s Google engineer Justine Tunney, a former Occupy Wall Street activist who now calls for “open-source authoritarianism.” Tunney has argued against democracy and in favor of a monarchy run by technologists, and advocated for the United States to bring back indentured servitude.

But perhaps best known among the techno-monarchists is Mencius Moldbug, the nom de plume of Curtis Yarvin, a programmer and founder of the startup Tlon, which is backed at least in part by the billionaire anti-democracy libertarian Peter Thiel. Thiel famously once wrote that he did not believe freedom and democracy were compatible, and has expressed skepticism about women's suffrage. Moldbug's polemics are circular and semi-comprehensible, blurring political theory and pop culture; Corey Pein of The Baffler described his treatises as "archaic [and] grandiose," while being "heavily informed by the works of J.R.R. Tolkien and George Lucas."

Both of these so-called thinkers are part of a larger movement that calls itself the "Dark Enlightenment," whose adherents are also known as "neoreactionaries." True to its name, the political agenda of the Dark Enlightenment includes a celebration of patriarchy, monarchy, and racialized theories of intelligence differentials.

The notion that monarchy is popular again in Silicon Valley might sound absurd. We associate monarchies with stodgy, quaint medieval kingdoms, the opposite of the disruptive, fast-moving tech industry. And yet those in the tech industry who find monarchy appealing are keen to point out that the hierarchical aspects of monarchical rule are actually quite familiar within their industry. As Pein mentions in his Baffler essay, Thiel delivered a lecture in 2012 in which he explained the connection:

A startup is basically structured as a monarchy. We don’t call it that, of course. That would seem weirdly outdated, and anything that’s not democracy makes people uncomfortable.

[But] it is certainly not representative governance. People don’t vote on things. Once a startup becomes a mature company, it may gravitate toward being more of a constitutional republic. There is a board that theoretically votes on behalf of all the shareholders. But in practice, even in those cases it ends up somewhere between constitutional republic and monarchy. Early on, it’s straight monarchy. Importantly, it isn’t an absolute dictatorship. No founder or CEO has absolute power. It’s more like the archaic feudal structure. People vest the top person with all sorts of power and ability, and then blame them if and when things go wrong.

[T]he truth is that startups and founders lean toward the dictatorial side because that structure works better for startups. It is more tyrant than mob because it should be. In some sense, startups can’t be democracies because none are. None are because it doesn’t work. If you try to submit everything to voting processes when you’re trying to do something new, you end up with bad, lowest common denominator type results.

The underpinnings of STEM Supremacy are, as I've laid out, difficult to see at first and stretch back to the end of World War II — but when put together they form a broader picture of where the philosopher-kings of the tech industry are heading, and what they believe. If we continue to live in a society that devalues humanities-type knowledge and glorifies STEM knowledge, this kind of thinking will persist, I fear. And the tech industry is partly responsible for cultivating this noxious worldview, in the sense that its PR apparatuses glorify STEM knowledge and encourage the public to view its leaders as demigods.

This isn't a unique phenomenon. In any situation where one ideology is denigrated and another valorized, there will at some point be a corresponding rise in chauvinism in favor of the valorized ideology. The situation today is made more complicated by the fact that the tech industry benefits from the normalization of STEM Supremacist beliefs. The unearned trust that the public places in tech startups and tech industry ideas, the lack of regulation, and the absurd valuations of companies that continue to lose money — all of this is motivated by an underlying belief that these companies are innately good, their owners smart, and their work more vital than that of other fields. Whether they admit it or not, you can draw a line from the public relations departments of tech companies to Justine Tunney's call for "open-source authoritarianism."

Ironically, the only antidote to all this sophistry is the humanities — the kind of critical thinking they entail, and the kind of thinking that computers cannot do. I've often wondered if part of the tech industry's investment in the digital humanities is designed to help stave off criticism of their companies. Indeed, by remapping the idea of what knowledge is in the first place, the tech industry is helping to realize a future in which we lack even the language to think critically about its role in society. Or maybe even a future in which technologists rule over us as monarchical, benevolent dictators — benevolent in their own eyes, at least. Perhaps this was the plan all along.


By Keith A. Spencer

Keith A. Spencer is a social critic and author. Previously a senior editor at Salon, he writes about capitalism, science, labor and culture, and published a book on how Silicon Valley is destroying the world. Keep up with his writing on Twitter, Facebook, or Substack.
