The admission came after the German site Netzpolitik reported that TikTok asked moderators to watch 15-second videos and decide whether the creator looked like the type of person others might want to bully. If so, moderators were instructed to add flags to the accounts of these “vulnerable” users. These flags would stop their videos from being shown to audiences outside their home countries and, in some cases, would even prevent them from appearing in other users’ feeds at all. A list of flagged users obtained by Netzpolitik included people with and without disabilities whose bios featured hashtags like #fatwoman and #disabled or displayed rainbow flags and other LGBTQ identifiers.
Among those who found out their content had been suppressed was Annika, or “Miss_Anni21,” a 21-year-old self-described fat woman with 23,000 TikTok followers. Although Annika’s videos have attracted both positive and negative comments, she told Netzpolitik that the action was “discriminatory” and “inhuman.”
A TikTok spokesperson told Netzpolitik that “this approach was never intended to be a long-term solution” and said the policies were no longer in use. TikTok also said, “While the intention was good, the approach was wrong and we have long since changed the earlier policy in favor of more nuanced anti-bullying policies and in-app protections.” Despite TikTok’s statement, Netzpolitik found that the rules were still in place as recently as September.
The team at TikTok that developed the video suppression policy may have earnestly believed it was a helpful response to the scourge of bullying. Online harassment remains an intractable problem, and people with disabilities are among those disproportionately targeted. One study found that in Boston, students with disabilities were 1.8 times as likely as their peers to be victims of cyberbullying. Yet the same study, authored by Miriam Heyman of the Ruderman Family Foundation, also found that students with disabilities were more likely to receive support from others via social media. For members of any minority group, social media provides an opportunity to connect with others who share their experiences and to find role models and content reflecting their own lives that isn’t represented in traditional media. Some even translate their reach into dollars. Aaron Philip, a disabled trans influencer, won modeling contracts with Sephora and Dove after going viral on Twitter, and Keah Brown landed a book deal with Simon & Schuster for her essay collection The Pretty One after her hashtag #disabledandcute took off.* Social media suppression denies people economic, political, and cultural opportunities and, in that sense, really isn’t that different from an employer refusing to hire a software engineer because they use a wheelchair.
Facebook has also come under fire for its policies toward disabled people. Earlier this year, the social network removed a video containing a sexy picture of amputee Vicky Balch from the Facebook page of Ability Access, which promotes the disabled community. A Facebook employee told Ability Access, “You will have to understand that some people see disability as disturbing.” After a backlash, the company apologized for its choice of words, although it declined to restore the video, saying that Balch’s partial nudity violated standards around adult content, not any policies related to people with disabilities.
There’s been a lot of debate about whether social media companies should “deplatform” bullies. But the solution is certainly not to deplatform the targets of those attacks.
Correction, Dec. 4, 2019: This post originally misspelled Aaron Philip’s last name.
Future Tense is a partnership of Slate, New America, and Arizona State University that examines emerging technologies, public policy, and society.