TikTok, the popular social media platform, is facing scrutiny over its safeguards for child users. A recent investigation by The Guardian revealed that moderators were instructed to allow users under the age of 13 to remain on the platform if their accounts claimed to be overseen by parents. This practice raises concerns about TikTok’s commitment to protecting young users and adhering to its own minimum age restrictions.
TikTok’s inconsistent enforcement of age restrictions
The investigation found instances where accounts of users who openly declared themselves to be under 13 in their bios were permitted to stay active on TikTok. The justification provided was that these accounts were supposedly managed by parents. This advice was disseminated in a group chat involving over 70 moderators responsible for content moderation across Europe, the Middle East, and Africa.
TikTok’s internal processes for handling suspected underage accounts involve sending them to a specific moderation queue. Moderators then decide whether to ban the account or approve its continued presence on the platform. However, the investigation suggests that the mere mention of parental oversight in a user’s bio could be enough to bypass TikTok’s age restrictions.
A TikTok staff member warned that underage users could evade bans so easily that knowledge of the loophole might spread among young users. TikTok denies the allegations, saying it does not allow under-13s on the platform, but the evidence gathered by the investigation suggests otherwise.
TikTok has faced regulatory challenges over its management of accounts belonging to users under 18. In September, the Irish data watchdog fined TikTok €345m for failing to protect underage users’ content from public view. Similarly, the U.K. data regulator fined TikTok £12.7m for misusing the data of children under 13.
The Guardian’s investigation also uncovered instances where potentially underage accounts received internal tags that could afford them preferential treatment. In one case, a “top creator” tag was attached to the account of a child who appeared to be under 13, pointing to a possible inconsistency in how TikTok enforces its community guidelines.
TikTok’s approach to age limits and child safety is governed by several regulations, including the U.K.’s children’s code and the EU’s Digital Services Act. These rules require platforms to implement measures such as parental controls and age verification to shield children from harmful content.