Dance Moms? Hit. Candace Cameron Bure? A definite miss. Soft butch lesbians? Sort of—at least in an aspirational sense. Wanda Maximoff? Another hit—and you’ve sunk my algorithmic battleship.
Although I’ve been researching digital culture and social media for years, the truth is that I’m fairly new to TikTok, and the app is very obviously doing its best to figure me out. As the topics above suggest, its algorithm still has a lot to learn. While TikTok’s algorithms are complex and, at times, difficult to wrap one’s head around, the reason I finally decided to download the app is simple. A distinct niche of TikTok users has taken to sharing how the app’s personalized recommendations, delivered through its For You Page (or “FYP”), have helped them discover new dimensions of their identity and, in particular, their sexuality. These narratives have moved beyond the TikTok app itself; they’re discussed and shared across Reddit forums, Twitter threads and even The Vergecast podcast.
I’ve always been intrigued by our online modes of self-discovery. For example, the last Buzzfeed quiz I took informed me that I’m a total Marnie (of HBO’s Girls): mature, ambitious and not always the most useful friend to have around. Elsewhere online, a WikiHow quiz titled “Am I Gay?” concluded with an excited yet tentative “You might be!” (Reader, I undoubtedly am.) As I navigated the quiz, however, one of its 12 questions stood out to me as a true indicator that our understanding of sexuality has evolved alongside technology. “If you scroll through your feed or FYP,” the final question begins, “do you see content from queer creators?” I was immediately curious about what my own FYP might tell me. Since installing TikTok on my phone, I’ve spent hours scrolling through my new account, but I’ve made only one clear discovery: a growing skepticism about what happens when we let our devices, and the corporations behind them, tell us who we are.
A quick search for this emerging trend of queer folks’ algorithmic revelations turns up a number of recent articles with telling titles like “TikTok’s algorithms knew I was bi before I did,” “TikTok made me (realize I was) gay” and even “The TikTok algorithm knew my sexuality better than I did.” The stories themselves are often more nuanced than their titles let on. Read in full, they shed light on TikTok’s algorithms and, specifically, the way we’ve collectively come to imbue these computational operations with meaning and credibility. Perhaps you relate. After all, who among us hasn’t closely guarded a Netflix or Spotify account from a friend’s request to play a show, a movie, a podcast or a song that we’re convinced might ruin our algorithm’s carefully crafted profile? This safeguarding reveals what we value, and what we value, it turns out, is the algorithm’s short-term memory of us.
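Why does one borrowed viewing session feel so threatening? A toy model of recency weighting suggests an answer. The Python sketch below is my own hypothetical illustration, not Netflix’s or Spotify’s actual code; the DECAY constant, the taste_profile function and the genres are all invented. The point it makes is general: if recent plays are weighted most heavily, a single off-profile play can briefly outweigh everything that came before it.

```python
# A hypothetical, heavily simplified sketch of recency weighting.
# Nothing here reflects any real streaming service's model.

DECAY = 0.5  # assumed per-step decay; older plays fade fast

def taste_profile(history):
    """Weight each play by recency: the most recent item counts the most."""
    weights = {}
    for age, genre in enumerate(reversed(history)):  # age 0 = most recent
        weights[genre] = weights.get(genre, 0.0) + DECAY ** age
    return weights

my_history = ["queer drama", "queer drama", "queer drama"]
print(taste_profile(my_history))   # the profile reflects me

my_history.append("action movie")  # a friend borrows the account once
print(taste_profile(my_history))   # one play now outweighs all the rest
```

Run it and the profile flips after a single appended play, which is all the justification a guarded password needs.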
Yet our relationships with algorithms tend to be fraught with paradoxes, and a common love-hate dynamic emerges. As Isabel Munson writes in “Mirror of Your Mind,” an article from the innovative and now-defunct tech and culture magazine Real Life, a number of issues can arise when we look to TikTok or any social media app as a reflection of ourselves. Many of these concerns are already ingrained in us: after all, we know that filters produce a false sense of reality; we know that a user’s posts represent only one, highly manufactured, version of an individual’s life. But something happens when we look inward using the algorithm’s funhouse mirror. Early in her research, psychologist Sherry Turkle found that all users, but especially younger, “digital native” generations, tend to be more open and less inhibited when interacting with machines than when interacting with other people. There are several reasons why this might be.
For starters, we often see our digital tools as less biased and more private than our real-world connections. Both of these ideas are false. Not only do social media companies frequently jump at the chance to sell our data to third-party advertisers, but algorithms designed by human programmers often reproduce the biases held by their creators. The inner workings of black-box algorithms (a term for systems where we know what goes in, the data, and what comes out, the recommendations, but not necessarily what happens in between) further amplify, warp and evolve these biases in ways that aren’t yet fully understood. To bring this home, I might ask why, for example, transphobic TikToks tagged as #ChristianityTikTok and #ConservativeTikTok appear on my FYP. Why do they appear on the TikTok platform at all? What convoluted digital dark magic has promoted these types of voices online? Far from remaining neutral or unbiased, social media algorithms, wired for increased engagement, encourage users to connect content to communities through tagging practices that ultimately solidify in-app spaces for all sorts of toxic rhetoric.
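For the curious, the black-box shape is easy to sketch, even if the real thing is not. The Python toy below is entirely hypothetical; the tags, watch times and scoring rule are my own inventions, and none of it reflects TikTok’s actual system. It shows only the contour: data (watch time) goes in, recommendations come out, and the engagement weighting in between quietly amplifies whatever a user lingers on, benign or toxic alike.

```python
# A deliberately simplified, hypothetical engagement-driven recommender.
# It illustrates the "data in, recommendations out" shape of a black box.
from collections import defaultdict

def update_profile(profile, video_tags, watch_seconds):
    """Data in: every second watched nudges the user's tag weights up."""
    for tag in video_tags:
        profile[tag] += watch_seconds
    return profile

def recommend(profile, candidates, k=3):
    """Recommendations out: rank candidates by predicted engagement,
    i.e. overlap with the tags the user has already engaged with."""
    def score(video):
        return sum(profile[tag] for tag in video["tags"])
    return sorted(candidates, key=score, reverse=True)[:k]

profile = defaultdict(float)
# One long watch is enough to tilt future rankings, regardless of
# whether the tagged content is benign or toxic.
update_profile(profile, ["#ConservativeTikTok"], watch_seconds=45)
update_profile(profile, ["#lgbtq"], watch_seconds=5)

candidates = [
    {"title": "video A", "tags": ["#ConservativeTikTok"]},
    {"title": "video B", "tags": ["#lgbtq"]},
    {"title": "video C", "tags": ["#dance"]},
]
print([v["title"] for v in recommend(profile, candidates)])
# ['video A', 'video B', 'video C']: the lingered-on tag rises to the top
```

Even in twenty lines, the feedback loop is visible; at platform scale, with millions of interacting signals, the loop becomes far harder to see into.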
For reasons like this, Jill Walker Rettberg, a digital culture researcher, proposes that we start looking at our algorithms with a bit more discernment. “In an algorithmic culture where we have far more data than we can possibly use,” she writes, “we need to start thinking more about how algorithms filter our content, removing or altering our data.” The analogy of filtering works well here: it draws attention to the ways in which the FYP algorithm attempts to sift users into categories that are easier to market to and to keep engaged. Categorizing people, however, is a project that queer folks have historically rebelled against. The ever-lengthening initialism LGBTQ2S+ as well as the catch-all “queer” are both testaments to this. We rarely fit fully or neatly into the categories we encounter. How much worse are we likely to fit when we let an algorithm conduct this categorical coding for us?
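A final toy sketch makes the stakes of that sifting concrete. Suppose, hypothetically (these segments and tags are invented for illustration, not drawn from any platform’s real code), that a system must place every user into exactly one marketing bucket:

```python
# A hypothetical hard-clustering step: every user lands in exactly one
# marketing segment, even users whose activity straddles several.

SEGMENTS = {
    "queer_content": {"#lgbtq", "#wlw", "#pride"},
    "christian_content": {"#ChristianityTikTok", "#faith"},
    "dance_content": {"#dance", "#DanceMoms"},
}

def assign_segment(user_tags):
    """Force-fit a user into the single best-overlapping segment.
    Ties and poor fits are resolved silently; nuance is discarded."""
    overlaps = {name: len(user_tags & tags) for name, tags in SEGMENTS.items()}
    return max(overlaps, key=overlaps.get)

# A user who engages across categories still gets exactly one label.
print(assign_segment({"#lgbtq", "#DanceMoms", "#faith"}))
```

The user in this example engages with queer, dance and faith content all at once, yet the function returns a single label and discards the rest. That discarded remainder is precisely where most of us actually live.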
TikTok can be a useful tool. For many, it provides new modes of representation and community, and kudos to whatever steps help someone come into their own sexuality. But tools can only do so much; they are not only limited but limiting. On the one hand, the WikiHow quiz stops short: I know I’m queer without hesitation. On the other hand, algorithms are often outright inaccurate in their constructions. One of my straight cis friends has an FYP dripping with queer content, but she knows (“lamentably,” she will tell you) that this doesn’t make her queer. Her ability to manoeuvre around the algorithm’s attempts to label her does, however, make her a discerning user. Rather than relying on what an algorithm says about us, it’s worthwhile and, trust me, more fun to figure it out IRL.