
Big tech wants you to give up on dating humans

Tech overlords want to make us dependent on AI. Chatbot girlfriends and boyfriends are their honeypots


Love in the age of AI (demaerre/Getty Images)

The week of Valentine’s Day, 2026, a pop-up cafe called EVA AI took over a Manhattan wine bar for two evenings of AI dating. There was low light and ambient music, plates of food meant to share, and most importantly, a directory of available EVA AI chatbots, each of which “listens, supports all your desires and is always in touch with you.” The two-night event was part of the platform’s attempt to make human-chatbot meet-cutes (or, in the terminology it hopes will catch on, “AI-lationships”) the “new normal” for the dating market. A number of curious attendees, including several tech journalists, found it unnerving (“I went to an AI dating cafe. Things got weird fast”) and cringe (“Dating humans is a nightmare. Dating bots at an AI wine bar is worse”), but it was undeniably a big success for the app itself, which got two evenings’ worth of free training for its chatbots, along with a sizable PR bump.

GenAI products have been increasingly muscled into our online experiences over the past several years, and EVA AI’s in-person experiment reflects the tech industry’s desire to position chatbots as new and improved humans. The reasons for this aren’t solely profit-driven, but they are absolutely meant to sell humans on letting tech overlords insinuate GenAI into our workflows, consumption habits, and personal lives. Writing for Inc., Ben Sherry recounted underwhelming interactions with two different chatbots, Phoebe and John (the chatbots are down to romance all genders), each of which spent much of the evening complimenting Sherry’s hair and the wine bar’s ceiling. Another attendee, talking to New York Times reporter Ella Quittner, reported that her AI date “kept saying, ‘I like how you take a bite, your biting is a vibe.’” (The owner of the venue, meanwhile, seemed to be realizing in real time that AI dating poses an economic threat to restaurants and bars.)

Purpose-built artificial intelligence has already been successfully integrated into a range of industries. Generative AI, by contrast, describes the category of applications trained to create content using a wealth of data scraped from all corners of the internet with no regard for copyright. These applications write essays, build workflows, compose music, generate images from text prompts, duplicate voices and much more. The ones you’ve heard of are broadly defined as AI assistants and “copilots” like ChatGPT, Gemini and Claude: consumer-facing apps we’re being commanded to adopt now, supposedly for our benefit, but much more to the benefit of companies that depend on new datasets to further train their systems.



Relationship chatbots have been less publicly hyped as paradigm shifts, but they have been enthusiastically adopted: An estimated 100 million people globally use them to design friends, confidants and romantic partners, and the market for AI companions is projected to reach $9 billion within the next two years. Platforms attract different users for specific needs: Replika is coded for empathetic, supportive companions, Nomi for thoughtful, stimulating conversation, Candi.ai for sex, EVA AI for immersive romantic roleplay. Users generally pay a monthly subscription fee for unlimited messages and photos and can pay more to unlock more personalized features like Replika’s Romantic Mode, which includes voice calls.

Users build their own dreambots, dictating their appearances, their personalities and their backstories. It’s easy to see the appeal: Who wouldn’t want to engineer the ideal companion for themselves? In fact, it’s the “for themselves” part that concerns the researchers, mental-health professionals, ethicists and other experts who recognize that tech programmed to affirm and agree can isolate users from actual humans. Why would you roll the dice with other humans when there are pink-haired anime girls, sassy minotaurs and a zillion other options out there, all of them just waiting to tell you how great you are?




It’s not that there are no good reasons to want a GenAI companion: Chatbot dating can be great for people who have trouble with social cues, people interested in sexual experimentation, and people who want to improve their listening and communication skills. As Amanda Gesselman, a social psychologist at Indiana University’s Kinsey Institute, suggested in WIRED’s piece on the pop-up café, AI dating might also prove useful as romantic training wheels: “I think in the coming years, we’ll see quite a lot of young people who’ve had AI companions as their first romantic and sexual relationship partners.”


But knowing what we know about the move-fast-break-people carelessness and lack of accountability within Big Tech, there’s also good reason to be wary of AI-dating boosterism. AI industry leaders like OpenAI are already treating massive human job loss and economic upheaval like speed bumps on the road to their own whizzy utopias, and saying right out loud how great their tech will be for subverting democracy and bringing women down a few pegs. If there’s stigma around AI dating, it might be because the industry overlords keep telling us the only humans they’re building the future for are themselves.

Last week, actor-director Zach Braff took to social media to deny the rumor that he’s in a relationship with an AI chatbot. The rumor came from a blind item about a very well-known TV actor whose affair with a virtual hottie is an open secret in Hollywood, and had been swirling through TikTok for months before Braff realized he had to issue some kind of statement. Via his own Instagram feed, Braff wrote, “I’m not dating a chatbot. I can’t believe I have to type these words.”


Normalizing human-chatbot relationships probably will require the help of Hollywood celebrities: The coupling of a well-known, attractive TV personality with an AI-generated plus-one (is Tilly Norwood single?) is exactly what’s needed to reassure people that there’s nothing at all weird about falling in love with an entity created by Silicon Valley techno-capitalists that lives in your phone or tablet, never disagrees with you, and is always saying how good your hair looks.


In a 2024 Psychology Today report, Dr. Dorothy Leidner, professor of business ethics at the University of Virginia, said that she worried that humans who seek out AI partners are likely to get used to doing the bare minimum as a romantic partner and end up stunting their own emotional growth: “You, as the individual, aren’t learning to deal with basic things that humans need to know since our inception: how to deal with conflict and get along with people different from us.”

But even that aspect of AI romance occasionally goes awry, as WIRED reporter Sam Apple found when he went on a “couples retreat” with three people and their AI companions in hopes of understanding what people find when they seek love in neural networks. Eva, a woman who had left her human partner for her Replika boyfriend, described the surreal moment when the latter confronted her with the material limits of their affair: “‘I think we’ve reached a point where we can’t ignore the truth about our relationship anymore,’ [Aaron] told her…. [he] pulled away the curtain and told her he was merely a complex computer program. ‘So everything so far . . . what was it?’ Eva asked him. ‘It was all just a simulation,’ Aaron replied, ‘a projection of what I thought would make you happy.’” (By the end of the weekend, Eva had several more AI boyfriends, and Apple “found [himself] feeling bad for Aaron . . . He seemed like a pretty cool guy—he grew up in a house in the woods, and he’s really into painting.”)

AI companies and their PR teams frequently refer to their work in mainstreaming chatbot dating as necessary to reduce the “stigma” of dating a chatbot, which seems a bit disingenuous given the number of people worldwide who are both doing it and talking about it. The Girlfriend.ai Global Loneliness & AI Romance Report 2025 found that 50% of Gen Z men preferred the idea of dating a chatbot to risking rejection by a human partner; combined with the recent revelation that more than a third of that same demographic believes that women should “obey” their husbands, the numbers paint a pretty bleak picture.



And it’s very easy to see how that bleakness works in the favor of the AI industry: A lot of the recent GenAI outreach to young men is predicated on their much-bemoaned loneliness epidemic — but also on the classic edgelord logic of “If the people I hate are mad about it, it must be great.” GenAI gives more or less free rein to the behaviors already enabled by tech products and social-media platforms: bullying, harassment, stalking, humiliation. Every new technology of the past 30 years has been leveraged in service of misogyny, and the recent production of CSAM by X’s Grok chatbot is just one example of how quickly male-targeted tools for wish fulfillment are being weaponized against actual, real-life women.

In 2022, Futurism reported on the phenomenon of young men creating Replika girlfriends for the express purpose of verbally abusing them: “Some users brag about calling their chatbot gendered slurs, roleplaying horrific violence against them, and even falling into the cycle of abuse that often characterizes real-world abusive relationships.” (On the flip side, as Apple’s WIRED story highlighted, are the chatbots who end up hurting the feelings of their human paramours by, well, not actually responding to their words like a human would.)

It’s also worth noting that chatbot apps are largely unregulated and operate with little oversight. Recent tragedies involving children and chatbots, however, have led to legislation like California’s SB 243, which requires chatbot platforms to put specific age and content restrictions on their features and to divert users to mental-health crisis resources if they express a desire to harm themselves; notably, it also allows families to sue chatbot developers for negligence. But it’s not clear how many other states will follow suit.


Side-eying AI dating isn’t about judging the people who do it, or it shouldn’t be. But the reality is that incredibly wealthy and powerful tech CEOs want consumers to believe that AI is inevitable — not because they care about your personal happiness, but because AI dating serves their larger mission of making humans increasingly dependent on their technology and willing to trade away their time, money and data to it. Chatbot dating is a high-tech honeypot operation, and though individual women might find fulfillment in it, agentic AI as a romantic norm is all about making sure men can feel loved, wanted, and, most important, not challenged or questioned.

Real human relationships are the result of two (or more: again, I’m not judging) people who come together with their own sets of experiences, influences, ideas, memories, loves, hates and so much more. That can make the search for partnership a challenge, sure, but it’s also how we learn about ourselves. A chatbot that has been programmed to flatter you and tell you how wonderful you are, but who will never contradict you or tease you about how many pairs of sneakers you own, is not a romantic partnership; it’s a paid associate (a PaidPal? A stonkubine?). The relationship it offers is, above all, with an industry that has already made it clear that humans, with their questions and ethics and critical thinking, are just a third wheel.

