NY Poised to Ban Kids From Accessing Addictive Algorithmic Social Media Feeds

New York legislators on Friday passed two bills aimed at making children safer online. The Stop Addictive Feeds Exploitation (SAFE) for Kids Act restricts access to addictive algorithmic feeds, and the Child Data Protection Act is aimed at keeping personal data safe.

The passage of both bills could change how children in New York use social media and force social media companies to offer different versions of their apps for children in the state. Social media platforms such as TikTok and Instagram, for instance, will no longer be able to serve content to users under the age of 18 based on their recommendation algorithms. Instead, they will have to provide a reverse-chronological feed for younger users.

Violations could result in hefty fines. According to the bill, a company will have 30 days to correct the issue before being hit with a fine of up to $5,000 per user under the age of 18.

A previous version of the bill included a provision that would have banned social media platforms from sending notifications to children between midnight and 6 a.m., but that provision was removed on Monday before the bill was voted on, NBC News reports.

The New York Child Data Protection Act, meanwhile, bans sites from collecting, using, sharing, or selling the personal data of anyone under the age of 18 without consent. At the federal level, the existing Children’s Online Privacy Protection Act (COPPA) requires parental consent for kids under 13, which is why most social networks don’t allow people under 13 to sign up. There’s an effort at the FTC right now to update COPPA to address the increased use of mobile devices and social networking.

New York Governor Kathy Hochul appears poised to sign the bills.

“New York is leading the nation to protect our kids from addictive social media feeds and shield their personal data from predatory companies,” Gov. Hochul said in a Friday statement. “Together, we’ve taken a historic step forward in our efforts to address the youth mental health crisis and create a safer digital environment for young people.”

A similar bill passed the California state Senate in May. Across the pond, UK regulators are also calling on social media sites to “tame toxic algorithms” and prevent kids from seeing content related to suicide, self-harm, eating disorders, and pornography.

Social media companies, meanwhile, have dabbled in kid-centric versions of their apps. Instagram scuttled its effort in 2021 amid concern that the Meta-owned site would not be able to keep kids and their data secure. Facebook’s Messenger Kids and YouTube Kids faced similar backlash, though both are still around.
