Justice Department Sues TikTok, Alleging Massive Child-Privacy Violations



The Justice Department is officially ticked off at TikTok. The DOJ on Friday announced a lawsuit accusing the ByteDance-owned social platform of massive violations of the Children's Online Privacy Protection Act (COPPA), and of breaching a settlement the company accepted to end a similar case brought by the Federal Trade Commission in 2019.

"This action is necessary to prevent the defendants, who are repeat offenders and operate on a massive scale, from collecting and using young children's private information without any parental consent or control," the announcement quotes Brian M. Boynton, principal deputy assistant attorney general and head of the Justice Department's Civil Division.

The 31-page complaint, filed on behalf of the FTC, outlines a variety of ways that TikTok either let under-13 users into the over-13 version of the service or ignored COPPA's restrictions on gathering data from people under 13:

• TikTok let kids under 13 evade its age restrictions by restarting the account-opening process with a different birth date from the one they'd just entered, a workaround available through "at least late 2020."

• Until "at least May 2022," TikTok allowed underage users to create an account using a "social login" from other services with looser age restrictions, such as Instagram and Google.

• The platform did not require all new accounts to go through an "age gate" process until 2022, which, combined with the previous loopholes, let in "millions of accounts" with no known age.

• TikTok gathered an illegal amount of data from accounts in its "Kids Mode," including "app activity data, device information, mobile carrier information, and app information," and then used that data to maintain profiles of those children.

• Until "at least mid-2020," TikTok shared this data with such third parties as Facebook, "in part to encourage existing Kids Mode users whose use had declined or ceased to use Kids Mode more frequently."

• The company made it unreasonably difficult for parents to request the closure of their under-13-year-old children's accounts and the deletion of that data.

• TikTok hired too few human moderators ("fewer than two dozen full-time," the complaint alleges), who at times could spend "an average of only five to seven seconds" inspecting accounts flagged for review as potentially underage, and it gave them inadequate resources for enforcing its rules covering under-13 users.

• Even when it deleted underage accounts, the service retained data about those users for too long (18 months in the case of "app activity log data") and without documenting where and why that data was kept.

Finally, the complaint says TikTok broke its 2019 settlement with the Federal Trade Commission over COPPA violations by TikTok's predecessor Musical.ly by lying about its compliance and not keeping required records. For example, employees used a ByteDance messaging system called Feishu that let them permanently wipe messages, possibly including correspondence about that settlement, "until at least May 2023."

The complaint seeks "a permanent injunction to prevent future violations" of COPPA as well as civil penalties authorized under the law: as much as $51,744 for each violation, calculated per user and per day. Multiplied by the alleged millions of illegal underage accounts, that could yield a fine of far over $100 billion. (The COPPA scene in the season 4 episode "Terms of Service" of HBO's Silicon Valley correctly captures how quickly this liability can escalate.)

TikTok has been publicly quiet about the lawsuit. CEO Shou Zi Chew has not commented on it from his own account, the company's @tiktokcomms account on X hasn't tweeted about it, and its Newsroom has no announcements about it. But TikTok spokesperson Michael Hughes emailed a statement Saturday morning echoing what company reps have said to other publications. "We disagree with these allegations, many of which relate to past events and practices that are factually inaccurate or have been addressed," he wrote. "We are proud of our efforts to protect children, and we will continue to update and improve the platform."

Hughes cites such child-safety features as a default daily screen-time limit of one hour for under-18 users (which those users can bypass) and the Family Pairing parental-oversight option TikTok added in April 2020.
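To see how quickly those per-user, per-day penalties compound, here is a minimal back-of-the-envelope sketch. Only the $51,744 statutory maximum comes from the complaint; the account counts and day counts below are illustrative assumptions, not figures from the case.

```python
# Hedged sketch: COPPA civil penalties can be assessed per user, per day.
# $51,744 is the statutory maximum cited in the complaint; user and day
# counts here are hypothetical, chosen only to show the scale.
MAX_PENALTY_PER_VIOLATION = 51_744  # dollars, per user per day

def max_exposure(users: int, days: int,
                 per_violation: int = MAX_PENALTY_PER_VIOLATION) -> int:
    """Upper bound on penalties, counting one violation per user per day."""
    return users * days * per_violation

# Two million accounts assessed for just a single day each already
# exceeds $100 billion:
print(max_exposure(2_000_000, 1))  # 103488000000, i.e. ~$103.5 billion
```

With "millions" of alleged underage accounts and violations spanning years rather than a single day, the theoretical ceiling dwarfs the article's "far over $100 billion" figure; actual penalties in such cases are negotiated or court-set well below the statutory maximum.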


COPPA, in effect since 2000 and amended in 2013, gives under-13 users sweeping online privacy protections, including a requirement for parental consent for the collection and use of their data. Adults, meanwhile, continue to have no general-purpose federal data-privacy law covering them online. Late last year, the FTC moved to kick off another update to COPPA to address the increased use of mobile devices and social networking.

Many services deal with COPPA by banning under-13 users; kids deal with those bans by lying about their ages, age verification being an exceedingly difficult thing to do remotely.

COPPA, however, ranks among TikTok's lesser headaches in Washington, since the platform now faces a commercial ban in the US unless ByteDance sells it, as mandated by an April law spurred by concerns over the ability of the Chinese Communist Party to control that Beijing-based firm. The company, which remains one of the most popular social platforms in the US, is challenging that law in court as "obviously unconstitutional" for its potential effects on First Amendment-protected speech.

