Facebook and Instagram face fresh EU digital scrutiny over child safety measures

The European Union opened fresh investigations Thursday into Facebook and Instagram over suspicions that they’re failing to protect children online, in violation of the bloc’s strict digital regulations for social media platforms, AP reports.

It’s the latest round of scrutiny for parent company Meta Platforms under the 27-nation EU’s Digital Services Act, a sweeping set of regulations that took effect last year with the goal of cleaning up online platforms and protecting internet users.

The European Commission, the bloc’s executive arm, said it’s concerned that the algorithmic systems used by Facebook and Instagram to recommend content like videos and posts could “exploit the weaknesses and inexperience” of children and stimulate “addictive behaviour.” It’s worried that these systems could reinforce the so-called “rabbit hole” effect that leads users to increasingly disturbing content.

The commission is also looking into Meta’s use of age verification tools to prevent children from accessing Facebook or Instagram, or from being shown inappropriate content. The platforms require users to be at least 13 years old to set up an account. It’s also looking into whether the company is complying with DSA rules requiring a high level of privacy, safety and security for minors.

“We want young people to have safe, age-appropriate experiences online and have spent a decade developing more than 50 tools and policies designed to protect them,” Meta said in a prepared statement. “This is a challenge the whole industry is facing, and we look forward to sharing details of our work with the European Commission.”

They’re the latest cases to focus on child protection under the DSA, which requires platforms to put in place stringent measures to protect minors. The commission opened two separate investigations earlier this year into TikTok over concerns about risks to kids.