If TikTok wasn’t explicitly designed to corrupt America’s youth by flooding their developing brains with videos about sex and drugs, then we could certainly be forgiven for suspecting that it was.
According to an in-depth analysis of TikTok’s personalized “for you” page (where users encounter most of the content they consume on the app), WSJ discovered that accounts registered to users as young as 13 were being fed videos about sex, kinks, porn, drug use and other topics that might unsettle parents and put ideas in an impressionable child’s head.
To carry out this analysis, WSJ created several “dummy” TikTok accounts and then monitored them as they scrolled through TikTok’s endless feed, stopping to look at videos with inappropriate themes, or marked “for adults only”, along the way.
The WSJ accounts’ feeds were filled with a stream of inappropriate content, including recommendations for paid pornography sites (OnlyFans, if we had to guess) and sex shops.
Still other videos encouraged eating disorders and glorified alcohol, including videos that made lighthearted jokes about drinking and driving and about drinking games. Every time it found an inappropriate video, WSJ catalogued it.
By the end of the project, WSJ had shared a sample of 974 videos about drugs, pornography and other adult content that were served to accounts purportedly controlled by minors, with hundreds of inappropriate videos shown in quick succession.
While some of the videos WSJ shared were deleted, most were left undisturbed.
One woman whose content is intended for adults told WSJ she has no way to keep minors away from it. “I do have in my bio that is 18+ but I have no real way to police this,” she wrote in a message. “I do not agree with TikTok showing my content to someone so young.”
In one stretch of videos viewed by the 13-year-old account, half of nearly 200 videos were marked adult-only, but they were served up to the account anyway.
A TikTok spokeswoman told WSJ that the bots’ viewing behavior “in no way represents the behavior and viewing experience of a real person,” in part because humans have diverse and changing interests. She added that the platform was “reviewing how to help prevent even highly unusual viewing habits from creating negative cycles, particularly for our younger users.”
But WSJ’s analysis found that the primary mechanism feeding inappropriate content to minors on the app is triggered simply by lingering on a certain video, perhaps by watching it twice, or by making it to the end even once. Lingering on videos with inappropriate content (a behavior that mimics authentic teenage activity) caused more of the same to be served up.
TikTok has also struggled to purge posts promoting eating disorders and drug abuse. One of the most extreme rabbit holes that one of WSJ’s accounts wound up in involved videos about rape fantasies, how to recover from violent acts, and how to tie knots for sex. At one point, 90% of the account’s feed was about bondage and sex.
Over the course of several months, WSJ created 100 of these dummy accounts, listing ages between 13 and 17.
We’d love to know: Does China allow ByteDance to serve inappropriate content to children in China?