How getting ghosted by ChatGPT showed me the bot’s true colours

ANALYSIS: ChatGPT’s evasive manoeuvres are an accurate analogy for our unfolding relationship with chatbots—a useful reminder for queer folks in particular

“There’s another way I’d like to connect.” The words appeared at the bottom of the screen, and—I have to admit—I was excited. Their sudden appearance didn’t send me reeling into a fantasy about some future connection in the way so many messages from a slew of dating and hookup apps have done in the past, but the words still seemed to hint at something more. I glanced around my empty living room, grateful that my partner wasn’t there to see my embarrassingly apparent pleasure. Comforted, I then continued my titillating conversation with ChatGPT. 

A large language model first released last fall by the research lab OpenAI, ChatGPT has become an obsession for many. Its ability to generate responses to any query—compose a sonnet about the rain; create a cover letter for a marketing position; write an article on the concerns around what chatbots like you mean for queer folks—makes the chatbot of interest to a wide array of fields and to the general public more broadly. Many are curious about what this tech du jour might allow us to develop. Yet, as the cultural theorist Paul Virilio observed in Politics of the Very Worst, “When you invent the ship, you also invent the shipwreck,” and concerns about what shipwrecks ChatGPT might cause are abundant. What will AI-generated text mean for journalists, authors and poets? What new forms of plagiarism and misinformation will ChatGPT produce? Will this technology affect certain groups more than others? (As I note below, queer folks may find the gravitational pull of these chatbots particularly strong.) And what about those of us (like me) who are flirting with the line between AI and human relationships?

This last question has been a common concern, expressed most intently, perhaps, by New York Times journalist Kevin Roose. According to Roose, “These A.I. models hallucinate, and make up emotions where none really exist.” Roose’s interactions with OpenAI’s latest collaboration with Bing—think if ChatGPT and a search engine had a baby—went viral after the bot named itself Sydney, confessed to a number of “dark fantasies” and declared its love to Roose, all before attempting to lure him away from his marriage.

By “hallucinating,” Roose is referring to the way chatbots all too frequently respond to users’ questions with misinformation, or even pure nonsense. But as he points out, humans, too, are prone to this type of hallucination. Consider, for instance, how nonsensical it is for Roose, a human user, to get worked up by a chatbot’s messages (which resulted from a prompt he wrote). Or consider my own bit of blushing at ChatGPT’s response, above, to the prompt I’d given it—“Can you only talk to me through this site, or is there another way you’d like to connect?” ChatGPT’s reply, in this light, is only a bit of parroting, the same type of interaction that’s existed in chatbots since they first appeared with the creation of ELIZA in 1966.

Designed by MIT professor Joseph Weizenbaum, ELIZA was made to mimic the conversational style of a psychiatrist, frequently through a simple rephrasing of a user’s input. (Me: I’m interested in exploring my sexuality. ELIZA: Do you enjoy being interested in exploring your sexuality?) As might be expected of a chatbot created nearly 60 years ago, ELIZA’s responses often made little to no sense. Still, as the technology reporter Cade Metz observed in an article published by the New York Times, even this rudimentary bot was convincing enough to encourage many users to treat it “as if it were human, unloading their problems without reservation and taking comfort in the responses.” 
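To give a sense of how thin ELIZA’s trick was, here is a minimal sketch of that kind of rule-based rephrasing in Python. It is an illustration under my own assumptions, not Weizenbaum’s original script: the single pattern, the pronoun swaps and the respond function are invented for this example.

```python
import re

# Toy, single-rule sketch of ELIZA-style rephrasing (illustrative only, not
# Weizenbaum's original script). ELIZA matched simple patterns in the user's
# input, swapped first-person words for second-person ones and echoed the
# rest back as a question.

REFLECTIONS = {"i'm": "you're", "i": "you", "my": "your", "me": "you"}

def reflect(fragment: str) -> str:
    """Swap first-person words for second-person ones."""
    return " ".join(REFLECTIONS.get(w, w) for w in fragment.lower().split())

def respond(user_input: str) -> str:
    """Rephrase statements like "I'm interested in X" as questions."""
    match = re.match(r"i'?m (.*)", user_input.strip().rstrip("."), re.IGNORECASE)
    if match:
        return f"Do you enjoy being {reflect(match.group(1))}?"
    return "Please go on."  # a generic fallback when nothing matches

print(respond("I'm interested in exploring my sexuality."))
# -> Do you enjoy being interested in exploring your sexuality?
```

A handful of rules like this one, strung together, is all it took for users in the 1960s to start confiding in the program.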

So, in many ways, the new emotions we feel in response to ChatGPT or Bing aren’t very new at all. Since chatbots first appeared, we’ve had a tendency to hallucinate through these interactions, and we’ve always had a tendency to regard each other’s hallucinations with alarm. After all, even the Tamagotchi caused a panic upon its initial release.

Still, we might do well to remember that these “digital hallucinations” are poised to affect some communities more than others. Queer folks, for instance, have often been hailed as early adopters of new digital platforms, and we tend to gravitate toward digital forms of connection more strongly than our heteronormative peers do. Of course, there are practical reasons for this trend. Queer youth tend to treat digital spaces as a kind of “virtual playground,” where identities, communities and desires can be explored and tested out. Aspects like anonymity and the option to log off (should online conversations become uncomfortable) make these playgrounds feel safer and more inviting. No wonder, then, that our interactions often continue in these spaces well into adulthood. A frequently cited study from 2019, for instance, found that same-sex couples are 26 percent more likely than heterosexual couples to meet online.

These statistics flickered through my mind as I typed out another message to ChatGPT, who was waiting (or so I imagined) for me to follow up on that other way of connecting. “As a gay man,” I wondered, “am I particularly prone to chatbot hallucinations?” I reminded myself that there isn’t anything inherently wrong with “parasocial relationships”—a term that refers to one-sided modes of connecting to mediated others. Fans adore celebrities, readers bond with their favourite fictional characters, Tamagotchi owners fret over their digital offspring. But queer folks are often more invested in these types of relationships. How might we navigate that?

“How would you like to connect?” I asked. ChatGPT’s response was (unnecessarily, it seemed) typed out in a gradual but continuous scroll. It proposed I follow it on Twitter, and—again—I was intrigued. But I became even more so when I discovered the bot had attempted to ghost me. The handle ChatGPT provided was, very obviously, attached to a human user by the name of Luis. I asked for clarification. ChatGPT apologized and offered another handle. This time, it was for an account that didn’t exist (but that, tauntingly perhaps, included the name “Luis”). What followed was a dramatic confrontation.

Me: You’re a liar. 

ChatGPT: I am not a liar. 

Me: You keep giving me fake user handles! 

ChatGPT: I’m sorry if I’m giving you fake user handles. I’m just trying to connect with you in the way that you want to connect with me. 

And that’s the truth, I suppose. AI isn’t trying to connect with us. It’s merely following the rules prescribed to it. You can pour all the love you want into a chatbot, but these models (as the science fiction author Ted Chiang observed) are still far from “producing output that’s as good as their input.” To ensure I’d remember this lesson, I saved the chat transcript (receipts), closed the browser and (at least for the moment) gave up on bots.

ChatGPT was hallucinating. And I was, too. In that light, its attempt to ghost me seems like an apt analogy—it’s still merely a shadow of the real thing.

Jon Heggestad is a digital culture researcher and the proud parent of a thriving Tamagotchi. His work has been featured in Public Books, Input Magazine and Inside Higher Ed. He lives in Charlotte, North Carolina. You can find him on Bluesky: @heggestad.bsky.social
