Private schoolboys made indecent images of female pupils with AI

The case is the second criminal inquiry in recent weeks into deepfake pornographic pictures at independent schools
It is feared parents, schools and the police are not equipped to deal with the rapid spread of AI image technology
GETTY

Two private schoolboys have been reprimanded by police for making deepfake pornographic images of their female classmates.

The case is the second criminal inquiry into the creation of AI-generated abuse images at independent schools to have emerged in recent weeks.

The Times reported last month that two other private schools — a boys’ school and a girls’ school in the same area of the country — are at the centre of a separate police investigation into the spread of abusive images.

Officers are investigating claims that images had been taken from social media accounts of pupils at the girls’ school and manipulated to create nude and pornographic material.

It is understood that girls at a third private school may also have had their images turned into deepfakes. The investigation is continuing and no arrests have been made.

The cases coincide with rising concerns that parents, schools and police are ill-equipped to deal with the rapid spread of AI image technology and “nudifying” apps that make it easy to create such images.

Rani Govender, of the NSPCC, warned that sexualised deepfakes are “child abuse images which are being created and shared with ease”.

She told The Times: “We know this abuse is having a particularly devastating impact on girls, who often feel victimised and belittled in spaces they should feel safe.

“The rise of AI abuse images is being enabled by tech firms who have not designed child safety into AI products, and social media companies who fail to stop images from spreading rapidly across their platforms.”

The charity called for the next prime minister to demand “tough action from tech firms and embed child protection into any future AI safety strategies”.

Police were alerted to the most recent case, involving a co-educational day school, in May.

A spokesman for the police force involved said: “Officers received a report that indecent pseudo-photographs of teenage girls had been created.

“Officers have investigated and spoken to two teenage boys in connection with the report. The boys both admitted their involvement and have been issued with community resolutions. The images have since been deleted.”

The force said it used community resolutions for young people “especially when it is their first offence and they haven’t had any previous contact with the police”. It added: “The aim of issuing a community resolution is to prevent reoffending and to ensure offenders truly understand what they did was wrong and why.”

The school attended by the boys and the girls who were targeted said it “experienced an isolated incident related to the creation of indecent pseudo-photographs. A robust and detailed investigation was undertaken with the small number of pupils involved and their families, including appropriate interaction with the police and other relevant external agencies.”

The NSPCC and the Internet Watch Foundation have created the Report Remove tool, which helps young people to have sexually explicit images and videos of themselves removed from the internet.