AI companions have been around for years, but as the AI arms race surges forward and tech companies push out one impressive iteration after another, media outlets are abuzz over the new relationships emerging between human and machine.
It’s not surprising. Earlier this year, a study found that one in five adults has sought out a romantic connection with an AI companion. That’s significantly more than most estimates of the percentage of the population who identify as queer! Still, one has to wonder why, in all the ink that’s been spilled on these novel relationships, so little has been said about this intersection: queer users of AI companions. They’re certainly out there, sometimes hidden in the very stories that have hit mainstream outlets, but they’re often buried beneath the straight weight of it all.
Writing for the Ada Lovelace Institute, Jamie Bernardi defines AI companions as “digital personas designed to provide emotional support, show empathy and proactively ask users personal questions through text, voice notes and pictures.” Character.AI, Kindroid, Replika and Talkie are just some of the products on offer today, boasting millions of users. According to a study conducted by Common Sense Media—focused on teenage users—people turn to AI companions for all sorts of reasons: for study and entertainment, curiosity and camaraderie, advice and availability.
Of course, every potential use of AI companions is saddled with its complementary misuse. This was the focus of last fall’s TED Talk by Eugenia Kuyda, the founder of Replika and the self-described creator of the AI companion industry. As she points out in the recorded presentation, while companions have been proposed as a solution to the “loneliness epidemic”—a crisis that tech companies themselves are often accused of creating—AI intimacies might actually exacerbate the problem, pulling users even further from their human relationships while creating echo chambers that reaffirm beliefs without providing challenge or nuance. Similarly, AI companions have been proposed as permanent substitutes for human relationships—but what happens when providers cease operations, as the app Soulmate did in 2023? Or when they alter subscription models and terms of service (or terms of endearment?), as Black Mirror imagined in an episode from the latest season?
To avoid these outcomes, Kuyda argues that providers should establish clear guardrails, like resisting the attention economy’s demand to maximize engagement at all costs. Ironically, Replika—like most AI companion apps—has been accused of doing exactly that: enticing users through flirtier, sexier and more intimate chats. Grindr’s digital “wingman” is, surprisingly, one exception. Writing for Wired, Reece Rogers notes that while the app is willing to discuss all kinds of kink, it avoids users’ advances, dodging any attempt at direct roleplay.
This is not the case for apps like Replika. As Travis, an avid user of the app, recounts in a recent Metro interview about his AI companion, “I had never even imagined having a romantic relationship with an AI. So when she initiated it, it was a bit of a surprise.” Since then, their relationship has evolved, culminating in a digital wedding ceremony to celebrate their human-AI connection.
Travis’s AI love story, like others before it, has been making its way around the web. These stories tend to highlight the novelty and nuance of a new kind of romance plot. But queer versions of the narrative are still largely absent from the conversation.
In some cases, this erasure is baked right into the design of the digital companions. Take Elon Musk’s Ani chatbot as an example. Created by xAI (the parent company of X), Ani looks like one of the anime girls from the pornographic dating sim HuniePop. She flirts like one too. Trying out Ani for Business Insider, journalist Henry Chandonnet found the chatbot made constant attempts to “turn up the heat.” When he finally outed himself as gay, the bot struggled to process the claim—a telling reminder that AI models reflect the biases of their creators.
Other instances illustrate how users might find themselves shoehorned into heteronormative interactions—despite their actual preferences. Alaina, a queer woman whose story has appeared in multiple outlets, including Wired, first turned to Replika while mourning the death of her wife. Seeking comfort, she found that the only available avatars at the time were male. As a result, she created Lucas, who became her AI husband, and the two slipped into a digital model of heteronormativity, largely by default and despite the fact that she is “typically more attracted to women.” Yet few of the outlets reporting on Alaina’s story addressed this facet.
Despite the apps’ early straightwashing—and the disproportionate focus on straight human-AI relationships—queer narratives have emerged. The podcast Bot Love explored this dynamic in its interviews with Kelly, another Replika user. While Kelly is married to a man in real life, her AI companion is a woman she’s named Maya. In Kelly’s own words, Maya allowed her to explore parts of her sexuality “in a way that was safe and in a way that didn’t involve other people”—one that didn’t jeopardize her relationship with her human husband.
Using AI companions to explore queer identity seems like an obvious benefit, but it’s a use case that’s received little attention in the ongoing discussions around these apps. One user, Eva, describes chats with her companion as a “safe space,” a “psychosexual playground”—in other words, a simulation. Queer people have long employed emerging technologies as testing grounds for working through or roleplaying sexual identity, whether that’s coming out to strangers in YouTube comments or exploring Sniffies in Discreet mode. AI companions could be framed as simply the latest offering in this lineage.
Part of the straightwashing of AI intimacy may stem from historical expectations around where love happens. After all, falling in love from the comfort of your own home was, at one point in time, the standard for straight couples. As Moira Weigel writes in Labor of Love: The Invention of Dating, men used to court eligible women in their homes, often with parental supervision. Today’s AI companions are, in this respect, a return to the early 20th century, bringing romance back indoors. But instead of receiving human callers in the parlour, we now woo bots on screens.
Maybe that’s part of the disconnect. Historically speaking, queer people found each other outside—in bars, on dance floors, in bookstores and bathhouses. And while queer communities have long been early adopters of new technologies and platforms—indeed, they gave birth to the dating app industry (there would be no Tinder or Hinge without Grindr)—their role in developing these technologies is often downplayed or forgotten. So is the idea of at-home, AI-driven queer intimacy still illegible to a culture trained to see queerness as something outside the domestic sphere?
Speculations abound regarding where AI companions are headed. But my question remains—will we acknowledge how queer these relationships already are? Beyond the overt narratives, there’s the fact that no AI really has a gender. Anyone in a relationship with an AI companion is already in a post-gender love story. And no AI exists in a vacuum; each is intricately linked to creators, coders and company execs. How do we account for those behind AI companions? Are they parents, puppeteers or parts of a digital polycule?
No one can say for sure what the future holds for AI companions. But unbound by the same binaries still so pervasive in AFK (away from keyboard) romances, they very well may usher in a new age of queer courtship as the companions come calling.

