Rachel Kaser, Internet Culture Writer
Rachel is a writer and former game critic from Central Texas. She enjoys gaming, writing mystery stories, streaming on Twitch, and horseback riding. Check her Twitter for curmudgeonly criticisms.
TikTok, the social media sensation du jour, has apparently been hobbling users who have physical disabilities. It did so in a misguided attempt to shield these people from bullying — heavy emphasis on the word “misguided.”
According to a report from Netzpolitik, leaked documents reveal TikTok had special rules in place for those with some visible or obvious form of disability or disfigurement. These include users with “autism… Down syndrome… facial disfigurement… [and] disabled people or people with some facial problems such as a birthmark, slight squint and etc.”
I say “visible or obvious” because TikTok groups such individuals under the heading “a subject highly vulnerable to cyberbullying.” Another part of the leaked document obtained by Netzpolitik explains that posts from users likely to “incite cyberbullying” will be “allowed, but marked with risk tag 4.” Posts marked with this tag by moderators would only be shown to users from the uploader’s own country, and wouldn’t be added to TikTok’s algorithmically sorted For You feed. So any users with unusual traits would be forcibly limited in their reach — and for what TikTok perceives to be their own good, no less.
So… wow, there’s a lot to unpack here. For starters, saying a person is likely to “incite cyberbullying” by virtue of something that isn’t their fault and can’t be changed is some prime victim blaming. And the moderators are supposed to make this value judgment within 30 seconds, according to Netzpolitik’s anonymous source. How the heck are you supposed to know a person is on the autism spectrum after watching 30 seconds of them lip-syncing to Old Town Road? I’m sure someone has a nasty comment loaded up in their internet troll cannon ready to go, but unless the person explicitly says they are on the spectrum, a moderator would just be going by their preconceived notions of how such a person looks or behaves.
This sounds like something that you’d accuse TikTok of in ignorance of how its algorithm works, except TikTok’s parent company ByteDance actually acknowledged it. A spokesperson told Netzpolitik that these rules were intended to protect vulnerable users from being cyberbullied, but were “never intended to be a long-term solution” and that this blunt force approach has since been altered. We’ve reached out to ByteDance to find out what the new rules entail.
TikTok’s already got a bit of a reputation for, at best, “nannying” its users. This usually takes the form of censoring users who express certain political opinions — see also, the makeup artist who was suspended for attempting to call attention to the Uyghur Muslim concentration camps in China. Her account has since been reinstated and the suspension blamed on a “human moderation error.”
(via The Verge)