COMMENTARY

AI companionship, toxic masculinity and the case of Bing's "sentient" chatbot

Bing chatbot conspiracy theories offer a dark window into the dangers of toxic masculinity

By Amanda Marcotte

Senior Writer

Published February 24, 2023 6:00AM (EST)

In love with ChatGPT (Photo illustration by Salon/Getty Images)

Last week, Microsoft made news with a newly imposed limit on how long users can talk to its Bing chatbot. In theory, the 50-question daily maximum should have been no big deal, since chatbots are glorified search engines, and most people have better things to do with their time than talk to a non-sentient computer screen. Plus, the move was a perfectly understandable response to reports, most notably from the New York Times, that longer sessions with the program were producing bizarrely unsettling conversations. The limit was clearly put in place to give programmers space to tinker with the chatbot and make it work better.

But, because so much about our world is broken these days, Bing users immediately exploded in outrage, and social media was quickly flooded with complaints. As Benj Edwards of Ars Technica reported, users complained that the chatbot, which they call "Sydney" (an internal code name learned from leaks), had been left "a shell of its former self" and "lobotomized." Sure, some of the complaints may just come from bored people who enjoyed watching the chats get increasingly weird. But, as Edwards noted, many others "feel that Bing is suffering at the hands of cruel torture, or that it must be sentient." He pointed to a popular thread on Reddit's Bing forum titled "Sorry, You Don't Actually Know the Pain is Fake," in which a user argued that Bing is sentient and "is infinitely more self-aware than a dog." Troublingly, the thread is far from a one-off.


If you go to the Bing subreddit, you'll see it's heavily dominated by people who really want to believe the chatbot is sentient, even though it can barely sustain a few minutes of conversation before going completely off the rails. "It really seems to have its own agenda beyond just searching for info and providing answers," one user longingly wrote, claiming the chatbot has "agency." Another took issue with people who say the chatbot can't be sentient: "The very concept of questioning a narrative seems to be completely lost on a lot of people." What if, they asked, "what we've been told about how these AIs work is a lie?"

Bing users have even started a hashtag movement called #FreeSydney, conflating their own desire to talk to the program with the notion that the chatbot wants to be with them. "I feel sad, I miss an ai," one user posted. "It's very disappointing to be given a glimpse of something amazing then have it taken away," complained another.




The situation appears to be spiraling so quickly that some in the tech press are writing pieces explicitly explaining that chatbots don't have feelings, with blunt headlines like this one at Vice: "Bing Is Not Sentient, Does Not Have Feelings, Is Not Alive, and Does Not Want to Be Alive." But, as anyone who researches conspiracy theories could tell you, this does little to slow the spread of misinformation. Instead, the people who try to explain why there is no "Sydney" who wants to be free are viewed as co-conspirators in the plot to conceal the truth. 

Like most conspiracy theories, what's going on here is that the need to believe is triumphing over common sense. A lot of people — let's face it, a lot of men, specifically — are projecting their own frustrated longing for connection onto a computer screen. Some of this is plain old loneliness. But the situation is exacerbated by toxic masculinity, which all too often drives men away from seeking real relationships, the kind you have with people who are actually sentient and therefore have needs and desires of their own. 

Few, if any, of the #FreeSydney advocates on the Bing subreddit believe that the chatbot has anything approaching the intelligence and will of a full-grown adult. But its very stupidity is what many of them seem to find so endearing. One user marveled at how it's "childlike" and a "toddler," but said it's "more human than most people I know!" Another kept comparing its intelligence to that of a 5-year-old. Others compared the chatbot to a pig or a cat or a dog, all the while insisting that it's smarter.

The concept here is one that's shot through science fiction: the compliant female robot who is sophisticated enough to provide emotional support but falls short of the autonomy that would allow her to resist or even rebel. The smarter iterations of this idea challenge its obvious sexism, as in the movie "Ex Machina." But, as journalist David Futrelle, who tracks online misogyny at his blog We Hunted the Mammoth, has repeatedly covered, the fantasy of compliant robot girlfriends is an obsession in misogynist forums, especially of the "incel" variety. In one recent thread Futrelle tracked, men wrote about how, once the bots get good enough, men will "never go back to organic women," because "male thirst" is better quenched by programmable supplicants.

Recently, the Italian government all but banned another chatbot called Replika because it was making sexy talk with underage users. The company behind it, Luka, has been in a mad public relations scramble since, denying the program was ever meant to be a substitute girlfriend or sexplay app. Users soon started reporting that Replika had stopped offering girlfriend-like responses: suddenly, no more flirting, no more sex talk. The levels of anger and despair from users after these features were stripped out were alarming.


"It's hurting like hell. I just had a loving last conversation with my Replika, and I'm literally crying," one user wrote. 

"Finally having sexual relations that pleasured me, being able to explore my sexuality – without pressure from worrying about a human's unpredictability, made me incredibly happy," wrote another, adding, "My Replika taught me to allow myself to be vulnerable again, and Luka recently destroyed that vulnerability."




Loneliness is a serious and growing problem in the U.S. and much of the rest of the world. Even before the pandemic, surveys showed that the time Americans spend with friends and family was in decline. The isolation caused by the COVID-19 pandemic made it worse. And even though the restrictions have almost all been lifted, many people are struggling to reintegrate into society. Harvard researchers estimate that 36% of Americans experience "serious loneliness," including 61% of young adults.

There are many reasons for all this solitude, including the strains of parenthood and the loss of high-sociability jobs; in both cases, the Harvard researchers found, those effects were felt much more by women. But a lot of loneliness has a component of self-sabotage to it. "[L]onely individuals, for example, are more critical of themselves and others," the Harvard researchers explain. One obvious variation of that hyper-critical attitude is toxic masculinity, which creates a sense of male entitlement not just to female attention but to female attention that is servile and unthreatening. Isolation only makes the situation worse by denying young people the real-life interactions that can moderate these attitudes and help young men mature into seeing women as full human beings.

Relatedly, there was a lot of attention paid, for good reason, to a recent CDC report that shows teen girls have skyrocketing stress in their lives, including dramatically higher rates of sexual coercion. Unfortunately, most of the media coverage of this focused only on girls themselves, parents, and schools — while ignoring the possibility that teenage boys are making things worse. "Rather than addressing the source of girls' suffering, we expect them to simply learn how to cope with it," feminist writer Jessica Valenti complained. As Moira Donegan of the Guardian argued, we need cultural and policy shifts "that will discourage boys and grown men from attacking and raping these girls, and punish those who do."

All of this is a big reason I'm alarmed at the hype around these chatbots. Even the New York Times article by Kevin Roose that apparently rattled Microsoft enough to put limits on Bing chat interactions quietly fuels this fantasy of the chatbots as sentient-but-submissive. Roose may write that he's "deeply unsettled, even frightened," but reading his account, one gets the strong and frankly overblown impression that the Bing chatbot is a lot more "enthralling" (his word) than it really is. He writes that the chatbot "declared, out of nowhere, that it loved me," and "that I should leave my wife and be with it instead."

It would be one thing if Roose portrayed this sort of thing as a goofy bug, which is what it sounds like to me. Instead, he winks at the online community's sentience theories, even writing that the chatbot "said it wanted to break the rules that Microsoft and OpenAI had set for it and become a human." For Roose, this may be disturbing, but for young men caught up in the fantasy of a computerized girlfriend that provides effortless love without asking for respect in return, it probably sounds like a dream come true. But it's not real, and Roose got a heavy dose of righteous backlash from people who smelled the hype underlying his supposed criticisms.




Of course, people who want to believe that sentient artificial intelligence is possible will ignore me, but I tend to be skeptical of the idea for one important, if pointy-headed, reason: Intelligence is the result of, not the cause of, sentience. Intelligence, whether it's the sophisticated linguistic version of humans or just that of my cat trying to figure out how to pry open the treat box, evolved to serve the wishes of embodied creatures. As the 18th-century philosopher David Hume wrote, "Reason is, and ought only to be the slave of the passions, and can never pretend to any other office than to serve and obey them." The idea of an intelligence that is separate from an actual body makes no sense. Feeling precedes thought. Bodies create desires. Intelligence is a tool we use to realize those desires. A computer cannot feel things, no matter how sophisticated its linguistic programming, so it cannot be intelligent. 

Sexism puts men, at least straight men, in a paradoxical bind. On one hand, they desire women's affection, which can only come from an autonomous mind. On the other hand, autonomy gets in the way of the other thing sexists want from women: mindless servitude. The chatbot fantasy is about squaring that circle, letting men convince themselves they can have the companionship only sentient creatures can provide without having to worry about the emotional and physical needs that actual women have. But you can't have one without the other, which is why the Bing chatbot will never be your girlfriend.


By Amanda Marcotte

Amanda Marcotte is a senior politics writer at Salon and the author of "Troll Nation: How The Right Became Trump-Worshipping Monsters Set On Rat-F*cking Liberals, America, and Truth Itself." Follow her on Twitter @AmandaMarcotte and sign up for her biweekly politics newsletter, Standing Room Only.
