
Child Exploitation and Payments: Lessons From Europol’s 2025 Report

Stolen data and AI are being used to exploit children. Europol has suggested reconsidering end-to-end encryption. Is that the solution?


Europol’s 2025 Internet Organised Crime Threat Assessment doesn’t mince words. Criminals have industrialised their operations. They use:

  • AI-generated grooming scripts and deepfake voices

  • Crime-as-a-service platforms with customer support

  • Initial access brokers selling your credentials

  • End-to-end encrypted channels to hide their tracks

And all of this runs on one thing: stolen data.

Fraud. Ransomware. Sextortion. Grooming. Human trafficking. It’s not about “breaking in” — it’s about logging in, with AI doing the heavy lifting.

For payments professionals, the spotlight tends to fall on impersonation fraud.

But Europol’s report makes it clear: stolen data and AI are also being used to exploit children. Financially and sexually.

Should we be doing more? Yes.

Are we? Not currently.

Here’s why that is a mistake.

Payments are the rail, whether you like it or not

In 2023, NCMEC received over 100 million files of suspected CSAM in the US alone. Every one of those files represents a child being harmed, and wherever that material was bought or sold, the money moved through the financial system.

Recent data shows the problem is accelerating. The Internet Watch Foundation reports that confirmed reports of AI-generated CSAM rose 380% in a year, from 51 in 2023 to 245 in 2024.

Abuse is livestreamed for paying customers. Modified content is sold to collectors. Generative AI uses stolen personal information to create images, videos, and chat scripts that enable manipulation and coercion.

Much of this content is bought, sold, and paid for through the financial system. On the bank’s watch.

Criminals are building synthetic child abuse economies — and the money is moving through legitimate rails.

Banks and fintechs are facilitating transactions that fund child abuse — even if they don’t host the content.

For fraud, banks already face growing scrutiny. In the UK, the new Failure to Prevent Fraud offence comes into force on 1 September 2025. Large organisations (broadly, those with more than £36 million in turnover or more than 250 employees) can be held criminally liable for fraud committed on their behalf if they lacked reasonable prevention procedures.

Nick Ephgrave, Director of the Serious Fraud Office, put it bluntly:

“I’m very, very keen to prosecute someone for that offence. We can’t sit with the statute books gathering dust; someone needs to feel the bite.”

This follows the UK’s APP fraud rules, which already hold banks responsible for consumer scam losses. Accountability is increasing. So is pressure to act.

So far, regulatory focus has been on fraud. But what if the same logic — accountability for enabling harm — were applied to child sexual abuse? What about human trafficking? Sextortion?

Europol’s Proposal: Lawful access by design

Europol’s report calls for lawful access mechanisms, meaning systems that give law enforcement access to critical data in cases of serious crime, even on encrypted platforms. It also calls for stronger digital literacy and a more coordinated European response.

To combat fraud, yes, but also to protect children.

The industry response was sharply divided.

“You can’t defend what you don’t understand.”

This quote, taken directly from Europol’s report, identifies where all meaningful intervention begins – from scams to human trafficking. No awareness, no defence.

But the industry kicked back. Why? Europol’s suggestion of “lawful access by design.”

In other words, it means reconsidering end-to-end encryption (E2EE) in the context of serious crime.

Privacy concerns are valid. Weakening encryption is neither easy nor desirable. But the current trajectory demands honest evaluation.

Here’s why this matters: NCMEC warns that up to 70% of child abuse referrals currently made by Meta could disappear once full end-to-end encryption is in place — not because the abuse stops, but because the platforms can no longer see it.

Some law enforcement agencies put the figure even higher, warning that 85% to 92% of child abuse reports could be lost under Meta’s encryption.

With end-to-end encryption in place, Meta can no longer see messages from online groomers that contain child sexual abuse material, and therefore cannot refer them to the police.

The content doesn’t disappear. The evidence does.

When privacy advocates argue that encryption protects against “bigger crimes,” we must ask: what crimes are bigger than the systematic sexual exploitation of children?

What if payments are the last line of defence?

This creates an unprecedented challenge for the payments industry.

As governments hold financial institutions increasingly accountable for criminal activity, CSAM represents the highest-stakes regulatory and reputational exposure possible.

As platforms implement comprehensive encryption, millions of the reports that drive law enforcement responses to child sexual abuse and exploitation will cease to exist, leaving payment systems as the primary remaining detection point.

Financial institutions risk becoming unwitting facilitators of child exploitation economies, with limited visibility into the criminal networks their systems enable.

The technology challenge

The encryption debate is often reduced to privacy versus security. But for the payments industry, the stakes are different. Monitoring systems weren’t designed to detect criminal economies hidden behind encrypted messaging — yet still flowing through legitimate rails.

If end-to-end encryption is implemented without an exception for detecting illegal activity like child sexual abuse imagery, millions of incidents of abuse will remain hidden and young victims will have no help or protection.

The payment industry cannot solve the encryption debate alone, but it can take immediate action:

  • We must develop sophisticated pattern recognition specifically designed to identify financial flows associated with child exploitation, even when the communications layer is encrypted (a minimal sketch follows this list).

  • That means partnering with technology companies and law enforcement to develop solutions that protect both privacy and children, rather than treating them as mutually exclusive.

  • As an industry, we have to engage constructively in lawful access discussions, bringing payment industry expertise to complex technical and policy challenges.

  • We should recognise that financial institutions lag in action partly because detection is hard, regulatory incentives are weak, and reputational risk can discourage intervention.

  • And anticipate that child protection will receive the same regulatory focus as fraud prevention, with similar liability frameworks.
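
To make the first point concrete, here is a minimal sketch of what rule-based screening of payment patterns could look like. Everything in it is an assumption made for illustration: the transaction fields, thresholds, placeholder country codes, and the flag_transactions helper are hypothetical, and any real typology would be developed with law enforcement input and tuned against known cases.

```python
from collections import defaultdict
from dataclasses import dataclass
from datetime import datetime, timedelta


@dataclass
class Transaction:
    sender: str          # internal customer identifier
    recipient: str       # counterparty identifier
    amount: float        # amount, normalised to a single base currency
    country: str         # counterparty country code (ISO 3166-1 alpha-2)
    timestamp: datetime  # settlement time


# Hypothetical risk inputs a compliance team might maintain.
HIGH_RISK_CORRIDORS = {"XX", "YY"}   # placeholder country codes
MICRO_PAYMENT_CEILING = 50.0         # repeated small payments are a known typology
REPEAT_THRESHOLD = 5                 # repeats to the same recipient within the window
WINDOW = timedelta(days=30)


def flag_transactions(transactions: list[Transaction]) -> list[str]:
    """Return sender IDs matching a simple typology: repeated micro-payments
    to the same recipient in a high-risk corridor within a 30-day window."""
    by_pair = defaultdict(list)
    for tx in transactions:
        if tx.country in HIGH_RISK_CORRIDORS and tx.amount <= MICRO_PAYMENT_CEILING:
            by_pair[(tx.sender, tx.recipient)].append(tx)

    flagged = set()
    for (sender, _recipient), txs in by_pair.items():
        txs.sort(key=lambda t: t.timestamp)
        # Check each run of REPEAT_THRESHOLD consecutive payments against the window.
        for i in range(len(txs) - REPEAT_THRESHOLD + 1):
            if txs[i + REPEAT_THRESHOLD - 1].timestamp - txs[i].timestamp <= WINDOW:
                flagged.add(sender)
                break
    return sorted(flagged)
```

The specific rule matters less than the design choice: flags like these feed human review and suspicious activity reporting, so explainability counts for more than automation.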

The moral imperative

Behind every encrypted message that shields a child exploiter is a payment that flows through systems payments professionals monitor and control. The financial trail remains visible even when communications disappear.

Encryption protects privacy. But it also protects those who exploit children. It locks away evidence. It hides criminal economies.

Payment professionals are uniquely positioned to address this challenge. Unlike social media platforms, financial institutions have established compliance frameworks, law enforcement relationships, and regulatory accountability.

The question isn’t whether we support privacy or child protection. It’s whether we’ll develop solutions that defend both.

Europol is sounding the alarm. The payment industry has the tools, relationships, and responsibility to respond. We must act — not just with better technology, but with the urgency and responsibility this crisis demands.
