WEEKLY EXPLAINER
TikTok censored LGBTQ2 users, body positivity advocates and users with disabilities, according to documents obtained by the German site Netzpolitik.
Here’s the background: On Monday, German digital culture blog Netzpolitik reported that Chinese-owned social media platform TikTok censored content produced by people with disabilities, body positivity advocates and LGBTQ2 people as recently as this past September as a measure to “protect” special users, a group the company identifies as people at risk of online bullying and harassment.
Netzpolitik obtained documents detailing instructions sent to the platform’s moderators to identify and mark content produced by “special users.”
TikTok advised moderators to look for a “subject who is susceptible to bullying or harassment based on their physical or mental condition. Example [are people with] facial disfigurement, autism, down syndrome, disabled people or people with some facial problems such as a birthmark, slight squint, etc.”
The level of censorship varied depending on the user. For instance, TikTok classified the group above as “risk 4,” and the content they produced was visible only in their home country. Other users’ content was marked “Auto R” and removed from the platform’s search algorithm, limiting their audience reach, once it reached a certain number of views (between 6,000 and 10,000).
According to Netzpolitik, a striking number of people who end up with an “Auto R” designation have “a rainbow flag in their biographies or describe themselves as lesbian, gay or non-binary.”
Now what? TikTok admitted to the censorship but said it was a temporary fix meant to address cyberbullying. An unnamed spokesperson for the company told Netzpolitik, “This approach was never intended to be a long-term solution and although we had a good intention, we realized that it was not the right approach. The rules have now been replaced by new, nuanced rules. The technology for identifying bullying has been further developed and users are encouraged to interact positively with each other.”
The company also released a follow-up statement to The Guardian saying, “Early on, in response to an increase in bullying on the app, we implemented a blunt and temporary policy. While the intention was good, the approach was wrong and we have long since changed the earlier policy in favour of more nuanced anti-bullying policies and in-app protections.”
Earlier this year, YouTube was sued by content creators who said the company censored videos produced by LGBTQ2 users. However, the company claims that its handling of “borderline content,” videos that fall between what is and is not acceptable under the platform’s terms of service, is actually working.
According to a report by The Verge, “more than 30 changes have been made to the way videos are recommended since January 2019.” When asked for details, the company wouldn’t say what those changes were or provide any numbers demonstrating the effectiveness of its content crackdown.
As for TikTok, this isn’t the only controversy the company is facing. It’s being sued for allegedly collecting and exposing data of children under the age of 13.
Wait, there’s more! Click here to subscribe to Xtra Weekly for roundups of LGBTQ2 news, culture and stories.