A cursory scroll through Instagram may turn up a bikini pic here, an implied nude there, a sultry thirst trap or three. This kind of skin-forward, naturalistic content feels like standard fare in the modern social media landscape, but increasingly, folks who inhabit marginalized identities have found themselves booted from the apps for posts that wouldn’t make a Kardashian blink twice.
Social media platforms, particularly Instagram and TikTok, seem to have major issues with folks who are fat and/or queer showing their assets, even in ways that abide by community guidelines. Beginning around 2019 and picking up speed since, many creators and artists have found their posts deleted, their accounts hidden from search results and, in the worst cases, their accounts removed entirely.
As a fat and queer influencer myself, I know the unease of using these apps for both work and leisure. I have seen peers’ accounts disappear, and noticed that people who once appeared often in my feed have suddenly vanished. Whenever this occurs, I worry that I could be next. It feels as though there are nebulous forces at hand in the algorithm, picking and choosing on a whim who to knock off the app next. Or at least, whose content to deprioritize. Or at least, whose views to severely limit through internal programming.
And this slight paranoia of mine isn’t just gratuitous concern—on the contrary, it is a result of witnessing the ways that marginalized folks are increasingly targeted on social media.
Megan Ixim, aka @msgigggles, a queer fat entrepreneur, activist and influencer from Jersey City, went through the unthinkable with her Instagram audience last fall. Her Instagram account is filled with bikini photos, bright outfits, fat joy and quippy body-positive captions. But in September 2022, Ixim logged on to Instagram to discover that her account was about to be deleted.
“I had not had a single post or Instagram warning regarding any of my content being taken down or anything going against guidelines for months,” Ixim says. “And so all of a sudden—I remember very specifically because I was working, and so I was on Instagram and it was, like, a Tuesday morning—and all of a sudden I was forced out of my account and told that it was deleted.”
As a queer fat creator whose posts focus on body and sex positivity, plus-size fashion and empowering women, Ixim had had her account deleted before. But in the past, the account was always reinstated. This time, the prognosis was grim. “The next day, I tried to pull every tool out of my tool shed to try to contact people, to try to find contacts through Meta [Instagram’s parent company] because sadly, there is no recourse that actually works in regard to the system that Instagram has,” Ixim says. “When your account is deleted, there’s no one you can directly talk to. You have to hope that someone stumbles upon your case and the likelihood of that is zero to none, especially when you exist in a marginalized body. There is no one rushing to help you. One of the ways that I found is really being loud and really being as proactive as you can.”
Ixim was surprised at how swiftly Instagram seemed to be getting rid of the community she’d spent eight years building.
“It literally was like, ‘Oh, your account will be deleted in 30 days.’ Like, no warning, nothing.”
Ixim knew she hadn’t had any recent content flagged or taken down, but this warning from Instagram seemed very dire.
Through connections and her steadfast community, she was able to get her account reinstated one day before it was set to be deleted. And Ixim is well aware that her work on Instagram lives under the threat of this happening again.
“The problem with Instagram is that engagement increases through obvious responses, reports, things like that. And people love to attack people in marginalized communities and marginalized bodies. There are not enough safety precautions for protecting the accounts that are not creating content that goes against guidelines, but that might receive more hate and more reports due to that content,” she says. “I just feel like I’m constantly reminded that the internet doesn’t want people like me on it. And the important thing is really leaning into community and understanding that you’re here for your specific people, you’re not here for everyone.”
Ixim has to consider the possibility of her platform being removed yet again.
“I … have to worry about just having my platform completely deleted in a blink of an eye.”
Knowing you are a niche creator (read: not thin, straight and/or white) with marginalized identities adds a layer to social media, one of vigilance and, often, caution about what you post.
The platforms maintain a space where strangers can target and berate creators like Ixim, and that daily landscape is a lot for one person to carry. Queer creators, especially those who are fat, trans and/or racialized, can be subject to a bevy of harassment alongside censorship for posting content that is acceptable from more mainstream creators. At times, responding to harassment on social media can lead to the person being harassed getting reprimanded.
“It’s something that has really affected me in the past years where I feel like if I create content and someone sends me horrific things and I happen to defend myself, I have a much more likely chance of getting deleted than that person does,” Ixim says. “And I think that’s the thing. It’s like, how much hate should one individual accept?”
Nyome Nicholas-Williams, aka @curvynyome, a Black plus-size model, has had her own share of frustrations with Instagram. In July 2020, she shot a portrait series with the photographer Alex Cameron. Nicholas-Williams was topless in some of the photos, posed and pictured the way many celebrities are in promotional imagery, and in a way many of her white and thin peers would feel very comfortable posting. After attempting to post her favourite shots from the shoot and getting a “violation of community terms” pop-up, Nicholas-Williams turned to Cameron to find out if the same was happening to her. Despite Cameron, a white woman, having posted similar imagery in the past, the photos of Nicholas-Williams were taken down by moderators on both accounts.
“We couldn’t understand why my body was being censored when I wasn’t naked. I wasn’t being vulgar. I was just showing my body in its entirety and its duty,” Nicholas-Williams says.
Nicholas-Williams rallied with activist, writer and speaker Gina Martin, as well as Cameron, and after gaining considerable momentum through social media, they were able to sit down with Meta.
“The change that we made so that images that are artistic in nature wouldn’t be taken down based on no, like, nipples showing or vaginas, et cetera, showing. So we changed that aspect of it and how they review boob covering. It’s very small, but it’s very big at the same time,” she says.
It was a very exciting time—Cameron and Nicholas-Williams even created a new commemorative image to celebrate the changes made.
“After we changed the policy and whatnot, things got better. Peoples’ images were staying up, and they were super thankful,” Nicholas-Williams says.
Instagram maintains that people are not being “shadow banned,” which is when accounts or content are actively suppressed in the algorithm from being seen by users.
“Now the issue is shadow banning. So it’s gone from images, certain images being taken down, more so to people being shadow banned completely,” says Nicholas-Williams. “So one thing has changed, and now the shadow banning of people, even though Adam [Mosseri, the head of Instagram] continually says that there’s no shadow banning on this platform.”
Meta, which also owns Facebook, admits that artificial intelligence is used on their platforms for proactive detection, automation and prioritization of content that is deemed inappropriate. “As our Community Standards Enforcement Report shows, our technology to detect violating content is improving and playing a larger role in content review.
“Together, these three aspects of technology have transformed our content review process and greatly improved our ability to moderate content at scale. However, there are still areas where it’s critical for people to review.”
Ixim says that when her account was taken down, a contact she had at Meta suggested the deletion might have been something as minor as a fluke.
“And so this person immediately pulled up my account and was like, ‘Well, this is really odd because you have no violations currently, and it seems like your account was deleted on a fluke or something internally that just deleted.’”
It’s more than destabilizing to realize that your audience, your work and your community could disappear in a flash if AI decides that you are no longer respecting community guidelines, and you have no human to reach to appeal the deletion.
When you factor in things outside of your control, such as being trans, fat, BIPOC or a combination of the three, building a career on social media can present a precarious reality, even with a large and supportive community.
For Ashleigh Nicole Tribble, a NYC-based designer and sex coach, a split-second decision to post a same-sex Sims couple on her Instagram (The Sims is a series of life-simulation video games), two fat Black femmes implied to be nude but under covers, would end up affecting her career in unexpected ways.
“They took the post down, and they said that the post was flagged for solicitation of adult services, or something like that. Or like adult sexual services. And then, like, 10 minutes later, the whole page is gone.”
Tribble’s account of 10 years, with 100,000 followers, simply vanished, thanks to two animated avatars.
“I remembered I had my backup account. So I was asking people if they could kind of reach out and do the same thing I was trying to do. And it kind of went on throughout the summer. And I eventually got an email back from them telling me that they’re not going to restore my account and that I had broken the community guidelines.”
Tribble noticed that certain creators seemed able to get their accounts back quickly, “if they’re not specifically a certain intersection or marginalization.
“The focus seems to be on a lot of suppressing the ‘sexual content’ on a platform that said that it was a big deal that they were going to be allowing people to see female nipples coming up. I’m still confused about that,” Tribble laments.
So, many of us are left wondering: now what? How do queer, fat and otherwise marginalized creators move forward and feel safe on Instagram and other platforms?
For starters, everyone I spoke to agrees that more access to human appeals versus automation would help mitigate some of the frustrations.
Tribble says, “For the most part, I would think that it should be that the process shouldn’t be automated. I firmly believe that the flagging and the moderation is automated, I think it’s not a person because I’ve spoken to a couple of people about it and nobody really knows whether or not it’s, like, a team or just, like, a program.”
Ixim says, “Creating outside boards for reviews, having external people that can consult, and having contacts, or marginalized communities where they know that they can directly talk to someone and have a voice that can speak for them, which I don’t think is happening.”
Marginalized folks doing work on social media just want to be able to access fair and equal treatment, much like we do moving through the world.
“I’ve said this over the past, like, three years, since the campaign, but ultimately, I just want the platform to be accessible for everyone; everyone to have the same level of playing field,” says Nicholas-Williams. “Would I say to just be able to express themselves and be themselves and not have to worry about their images being taken down or censored? It’s just ridiculous that it’s still happening in 2023.”