The Keyword
EU lawmakers must act now to ensure the continued protection of children

As technology companies, we are deeply concerned about the breakdown in EU negotiations to secure the continued protection of minors against child sexual abuse. Allowing the ePrivacy derogation, in place since 2021, to expire on April 3 and its legal basis to lapse would be irresponsible. It must be extended.

Failure to act will reduce the legal clarity that has enabled companies for nearly 20 years to voluntarily detect and report known child sexual abuse material (CSAM) in interpersonal communication services, leaving children across Europe and around the world with fewer protections than they had before.

Voluntary detection of CSAM through hash matching is an established tool central to law enforcement investigations, as it helps identify ongoing child abuse and prevents dissemination of highly damaging and illegal content.

Long-standing, industry-wide hash matching uses irreversible digital fingerprinting to identify known CSAM. By matching these unique hashes against a secure database of previously identified material, the system ensures high-precision detection while adhering to privacy principles.
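The matching step described above can be sketched in a few lines. Note that production systems use perceptual hashing (such as Microsoft's PhotoDNA) so that visually identical images with minor byte-level differences still match, and the hash databases are maintained by child-safety organizations; the cryptographic hash and the `KNOWN_HASHES` set below are simplifications used purely to illustrate how fingerprint-based matching flags only exact database hits without revealing anything about non-matching content.

```python
import hashlib

# Illustrative stand-in for a secure database of fingerprints of
# previously identified material. In practice this would hold
# perceptual hashes curated by child-safety organizations, not
# SHA-256 digests of arbitrary test strings.
KNOWN_HASHES = {
    # SHA-256 fingerprint of the bytes b"test", used as a placeholder.
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def fingerprint(content: bytes) -> str:
    """Compute an irreversible digital fingerprint of the content.

    The hash cannot be reversed to recover the original bytes, which
    is what allows matching without inspecting message contents.
    """
    return hashlib.sha256(content).hexdigest()

def matches_known_material(content: bytes) -> bool:
    """High-precision check: flag only exact matches against the database."""
    return fingerprint(content) in KNOWN_HASHES
```

Because only the fingerprint is compared, content that does not match the database reveals nothing about itself to the system, which is the privacy property the statement refers to.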

Disrupting the detection of this egregious content reduces the tools available to industry to protect children and risks failing the victims of this abhorrent crime.

We urge lawmakers in Europe to swiftly agree on a way forward for voluntary CSAM detection in interpersonal communication services and enable the continuation of established tools to protect minors. Failure to do so would be irresponsible.

Signatories: Google, LinkedIn, Snapchat, Microsoft, TikTok
