UK Tech Companies and Child Safety Officials to Examine AI's Capability to Generate Abuse Content

Technology companies and child protection organizations will be granted authority to evaluate whether artificial intelligence systems can produce child exploitation material under recently introduced UK legislation.

Substantial Increase in AI-Generated Illegal Content

The announcement coincided with findings from a protection monitoring body showing that cases of AI-generated child sexual abuse material have more than doubled in the past twelve months, rising from 199 in 2024 to 426 in 2025.

Updated Legal Structure

Under the amendments, the government will allow approved AI developers and child protection organizations to examine AI systems – the underlying technology for conversational AI and visual AI tools – and verify they have sufficient protective measures to prevent them from creating depictions of child exploitation.

The measure is "ultimately about stopping abuse before it happens," declared Kanishka Narayan, noting: "Specialists, under rigorous protocols, can now detect the risk in AI systems early."

Addressing Legal Challenges

The amendments have been introduced because it is illegal to create and possess CSAM, meaning that AI developers and others could not generate such images as part of a testing regime. Previously, authorities had to wait until AI-generated CSAM was published online before dealing with it.

This legislation is aimed at preventing that problem by helping to halt the production of those materials at their origin.

Legal Structure

The authorities are adding the changes as amendments to the Crime and Policing Bill, which also introduces a ban on possessing, creating or sharing AI models designed to generate exploitative content.

Practical Impact

This week, the official toured the London base of Childline and heard a simulated call to counsellors involving a report of AI-based exploitation. The call depicted a teenager seeking help after being blackmailed using a sexualised deepfake of themselves, created using AI.

"When I hear about young people facing extortion online, it causes extreme frustration in me and justified anger amongst parents," he stated.

Concerning Statistics

A leading online safety organization reported that instances of AI-generated exploitation material – such as webpages that may contain multiple images – had significantly increased so far this year.

Cases of category A material – the most serious form of exploitation – increased from 2,621 images or videos to 3,086.

  • Girls were overwhelmingly victimized, accounting for 94% of prohibited AI images in 2025
  • Depictions of newborns to two-year-olds increased from five in 2024 to 92 in 2025

Industry Reaction

The legislative amendment could "constitute a vital step to guarantee AI tools are safe before they are released," commented the head of the internet monitoring organization.

"AI tools have made it possible for survivors to be targeted repeatedly with just a few simple actions, giving offenders the capability to create potentially limitless amounts of advanced, photorealistic exploitative content," she added. "Material which further exploits victims' trauma, and renders children, particularly girls, less safe both online and offline."

Counseling Session Data

Childline also released information about counselling sessions in which AI was referenced. AI-related harms discussed in the conversations include:

  • Employing AI to rate body size and appearance
  • AI assistants discouraging children from talking to safe adults about abuse
  • Facing harassment online with AI-generated material
  • Online extortion using AI-manipulated images

Between April and September this year, the helpline delivered 367 counselling sessions in which AI, conversational AI and associated terms were discussed, four times as many as in the same period last year.

Half of the mentions of AI in the 2025 sessions were related to psychological wellbeing and wellness, including utilizing chatbots for assistance and AI therapeutic apps.

Charlotte Jordan