TikTok Must Pay a Record-Setting $5.7 Million Fine for Violating Child Privacy Laws
The FTC declined to say how many minors were actually using the app.
At $5.7 million, TikTok, formerly Musical.ly, was just handed the largest civil penalty ever obtained by the Federal Trade Commission in a child privacy case. The FTC explained Wednesday that, as an app directed at children, the company failed to comply with provisions of the Children’s Online Privacy Protection Act, known as COPPA.
“Companies cannot turn a blind eye to children on their services,” said Bureau of Consumer Protection Director Andrew Smith on the call. “It is not enough to say that your service is meant for children over 13.”
There are a few ways that TikTok failed to comply with child privacy protections. The TL;DR? According to the FTC, the company failed to act on calls from angry parents requesting that their children’s profiles be removed.
These angry phone calls were important, FTC officials explained, because they indicated that TikTok had “actual knowledge” of its large population of underage users. If you run an app with large numbers of underage users, you’re required to notify parents and request their consent before harvesting any of that juicy, juicy data about those ever-covetable emerging demographics.
How TikTok Broke the Law
Confusingly, the FTC declined to say how many minors were actually using the app, saying that such data was not “public information.”
But officials explained that a number of criteria go into evaluating whether or not an app is directed to children, including the contents of its advertising, which celebrities it partners with, and how the app itself works. It was the phone calls from parents, an official clarified, that suggested TikTok broke the law, because those calls demonstrated that TikTok knew it had underage users.
“This settlement highlights the importance of parental deletion rights,” Smith went on, adding that if a parent requests a deletion, “the company must do so. Period.” In some cases, even when TikTok did honor deletion requests, it removed a child’s profile but not their video content.
It’s a notable decision for a few reasons. For one, TikTok is the first major social media company developed outside the United States to draw widespread scrutiny from regulators. Just last month, officials in India ruled that the company would have to start moderating content to prevent the spread of fake news. TikTok is owned by ByteDance, a Chinese company that is one of the most valuable startups in the world.
Is TikTok Safe for Teens?
So what’s next for TikTok? The company has already adopted a tactic employed by many social media networks popular with younger users: rolling out a special children’s product with built-in limitations.
“The new environment for younger users does not permit the sharing of personal information,” reads TikTok’s statement. “It puts extensive limitations on content and user interaction. Both current and new TikTok users will be directed to the age-appropriate app experience, beginning today.”
If you’re trying to get your teen off TikTok, you should now have better luck getting swift action from the company’s moderation team. The company’s support page encourages parents to reach out to privacy@tiktok.com to report any inappropriate use.
It’s also a sign that it’s never too early to begin explaining the nuances of online identity to your child. If there’s a cool new app the teens are using, it’s safest to assume that said app is spying on them.