EU Softens Plan to Make Big Tech Scan for Child Abuse Content, Shifts Enforcement to Member States

EU countries have agreed on a softer stance toward proposed child-protection legislation, stepping back from rules that would compel Big Tech to detect and remove online child sexual abuse content. The revised position prioritizes privacy concerns, leaves enforcement to national authorities, and sets the stage for final negotiations with EU lawmakers.

European Union member states have reached a shared position on proposed online child-protection legislation—but without requiring global tech companies to proactively detect and remove child sexual abuse material.

The stance, announced Wednesday by the European Council, marks a win for U.S. tech giants such as Alphabet’s Google and Meta, as well as for privacy advocates who argued the earlier proposal risked undermining digital rights. It also reflects a broader pushback against stringent tech regulation, particularly from the United States.

The Council’s approach is notably less strict than the European Parliament’s 2023 position, which would have obligated messaging platforms, app stores, and internet service providers to detect, report, and remove both existing and newly discovered abuse imagery, as well as instances of grooming.

Originally drafted in 2022, the legislation aims to improve coordination across the EU’s 27 member states, as online child exploitation increasingly transcends national borders.

The next phase requires EU countries to negotiate the final details with the European Parliament before the legislation can become law.

According to the Council, online service providers must assess the risk of their platforms being used to share child sexual abuse material or facilitate contact between offenders and children. They must also take preventive steps—but enforcement will be left to national governments.

“Member states will designate national authorities responsible for evaluating these risk assessments and mitigation measures, with the option to require providers to implement them,” the statement said. “In cases of non-compliance, providers could face penalty payments.”

The draft law also allows companies to continue voluntarily scanning content for child abuse material after April next year, when the current exemption from EU privacy rules is set to expire. Additionally, it will create an EU Centre on Child Sexual Abuse to support national authorities and provide resources for victims.

“Every year, millions of files are shared that depict the sexual abuse of children,” said Peter Hummelgaard, Denmark’s justice minister. “Behind each image and video is a child who has suffered unimaginable harm. This is completely unacceptable.” He welcomed the agreement among member states as a key procedural step toward advancing the legislation.

Separately on Wednesday, the European Parliament urged the EU to set minimum age requirements for children to access social media platforms, citing rising mental health issues linked to excessive online exposure. The recommendation is non-binding.

Globally, momentum is building for stricter age-based rules. Australia is preparing to introduce the world’s first social media ban for children under 16, with Denmark and Malaysia also announcing plans for similar restrictions.
