TikTok said it removed more than 7 million accounts belonging to suspected under-13 users during the first quarter, a significant step toward enforcing the app's minimum age requirement. In total, more than 11.1 million accounts were removed for violating the app's guidelines, 7.26 million of them belonging to users suspected of being under the age threshold, the company said in a blog post on Wednesday.
In the US, internet sites are required to obtain parental consent before collecting data on children under 13, under the Children's Online Privacy Protection Act. But many kids lie about their age and create accounts anyway, on social media sites from Instagram to YouTube. TikTok, which is owned by ByteDance Ltd., has drawn particular scrutiny because of the large amount of data its sophisticated algorithm collects and its popularity, especially among young people.
In 2019, TikTok was forced to pay the US Federal Trade Commission a record $5.7 million fine for unlawfully collecting children's data, including the names, email addresses, and locations of children who used the app.
Since then, TikTok has added several features to make the platform safer for young users. About two years ago, it launched a dedicated section of the app for children 12 and under called TikTok for Younger Users. The section offers "a curated viewing experience with additional safeguards and privacy protections," according to the company.