EU child safety gap widens as CSAM detection framework expires
A political deadlock between EU institutions has led to the expiration of the legal basis for platforms to proactively detect child sexual abuse material (CSAM).
The European Union faces a significant 'child safety gap' following the expiration of the temporary legal framework that allowed digital platforms to proactively detect and report child sexual abuse material (CSAM). The expiry follows months of political deadlock between the European Commission, the Council, and the Parliament over the balance between user privacy and child protection.
Tech companies now operate under significant legal uncertainty, and many are expected to halt voluntary scanning to avoid violating the GDPR. This cessation is predicted to cause a sharp drop in reporting to global agencies such as NCMEC. Industry leaders have characterized the lapse as 'irresponsible,' noting that the extraterritorial reach of EU privacy law may force platforms to limit detection for EU citizens globally, not just within the bloc's borders.
The primary friction point remains the role of artificial intelligence. The Commission and Council advocate a broad mandate covering new and AI-generated abuse material, while the Parliament has pushed for a narrower scope limited to 'known' abuse images in order to protect encryption and privacy. The urgency is underscored by a roughly 1,300% increase in reports of AI-generated CSAM over the last two years. Negotiations on a permanent framework continue, with a crucial European Parliament vote on a new mandate scheduled for June 2026.
Key takeaways
- The temporary legal derogation allowing platforms to scan for child sexual abuse material (CSAM) expired on April 3, 2026.
- A similar legal vacuum in 2020 resulted in a 58% decrease in CSAM reports to the National Center for Missing and Exploited Children (NCMEC) within 18 weeks.
- The deadlock centers on a dispute between the European Parliament's focus on privacy and the European Commission's push for broader detection including AI-generated material.
- Reports of AI-generated abuse material surged from 4,700 in 2023 to 67,000 in 2024, complicating legislative negotiations.
- The European Parliament voted against a temporary extension in late March 2026, despite pressure from law enforcement and child rights groups.
Sources
- freshfields.com: https://riskandcompliance.freshfields.com/post/102mopa/an-uncertain-path-forward-the-eprivacy-derogation-and-child-safety-detection
- europa.eu: https://www.europarl.europa.eu/RegData/etudes/BRIE/2022/738224/EPRS_BRI(2022)738224_EN.pdf
- europa.eu: https://www.europarl.europa.eu/topics/en/article/20231116STO11629/how-the-eu-is-fighting-child-sexual-abuse-online
- iapp.org: https://iapp.org/news/a/territorial-scope-of-the-gdpr-from-a-us-perspective
- europa.eu: https://www.europarl.europa.eu/news/en/press-room/20260306IPR37531/child-sexual-abuse-online-support-for-extending-rules-until-august-2027
- therecord.media: https://therecord.media/eu-parliament-rejects-csam-scanning-extension
- eu.ci: https://eu.ci/eu-chat-control-regulation/

