Deepfake porn, a BBC composer and the humiliated friend who fought to have him arrested

When a childhood friend of Alex Woolf discovered he was to blame for fake images of her online, police told her it wasn’t a crime

Alex Woolf, a former winner of the BBC Young Composer of the Year competition, arranged for his friend's face to be transposed on to the body of a porn actor. Credit: Geoff Pugh for the Telegraph

It was the early morning of March 10 2021 when Jodie*, then 23, opened an anonymous email. A link directed her to a porn site, where she found countless naked images and videos – of herself. “I was completely freaking out, I thought my whole life was over. I broke down and was screaming and crying, I practically blacked out,” she recalls.

Her housemate heard Jodie crying and came into her room. In shock, Jodie showed her the images and tried to explain that they featured her face, but not her body. She had been “deepfaked,” becoming the latest victim of a rapidly developing technology that allows strangers to stitch anyone’s face onto the bodies of performers in hardcore porn pictures and videos.

It turned out Jodie’s childhood friend, Alex Woolf, had taken clothed pictures of her and posted them on an online forum so that other users, skilled in deepfake tech, could then transpose her likeness onto the pornographic images.

Woolf’s post read: “She makes me really horny, have never done this before, would love to see her faked.” Replies then flooded in with sexualised comments and the images he asked for were indeed created. Horrified by the betrayal of a trusted friend, Jodie says: “When the police confirmed it was Alex who had done it, I cried so hard all the blood vessels in my eyes burst.” 

The past few years have seen an exponential rise in sexually explicit deepfake content. In the first three quarters of 2023 alone, researchers identified 143,733 new deepfake porn videos uploaded to the internet, compared with just one in 2016.

But while the creation of deepfake porn has exploded – just this week, it was revealed that several high-profile female politicians have had their images used on pornography sites – UK law has been slow to catch up. It was only in January that the non-consensual distribution of intimate deepfakes was criminalised.

Subsequently, the government announced plans for a new law that would ban the creation of sexually explicit deepfakes, though not soliciting others to create the fakes, as Woolf did. It is not certain whether this legislation will be taken up by a new government.

‘Hundreds of people were humiliating women in any way they could’

At the moment, only the creation of sexually explicit deepfakes of under-18s is an offence. Last month, it emerged that police had launched investigations at two private schools, following claims that pornographic images of students at a girls’ school had been created in a boys’ school using innocent images sourced from social media. This week it was reported that, in a separate case, two private schoolboys had been reprimanded by police for deepfaking their female classmates.

“The attempts to try and address image-based abuse so far have been piecemeal and inadequate,” says Rebecca Hitchen, head of policy & campaigns at the End Violence Against Women Coalition (EVAW). 

On top of the understandable social mortification, Jodie felt doubly victimised by this gap in the law – battling both with police who, she felt, did not provide enough support, and with the sites that allowed pictures of her to be shared online without her consent.

Recalling the sheer volume of pornographic images freely available on the site, Jodie says: “Alex was playing out these sexual fantasies he couldn’t have in real life and hundreds of people online were just joining in, humiliating women in any way they could.”

“I was the victim of what felt like a targeted hate crime, I was very scared to leave my house, and there was a complete lack of awareness from the police.” 

After initially approaching the City of London Police, and being turned away because they said no crime had been committed, Jodie went to the Met Police, who launched an investigation.

With the help of 14 other victims whose images Woolf had also taken (he uploaded them to social media with grossly offensive messages; only Jodie was deepfaked), Jodie compiled and delivered a 60-page document of evidence to the Met Police. She felt the police “had everything handed to them on a plate”.

Even so, Jodie says Woolf’s victims were not provided with a liaison officer, making it incredibly difficult to get updates on the case. “We were at the beck and call of the Met getting in touch with us,” she says. The investigation took a serious toll on Jodie’s mental health. Not feeling safe, as Woolf knew where she lived, she ended up quitting her job and moving house. 

In the end it was not his soliciting of pornographic images of Jodie that proved Woolf’s legal undoing, but the offensive messages he had posted publicly about the other – clothed – victims. The former winner of the BBC Young Composer of the Year competition pleaded guilty to 15 charges of sending messages that were grossly offensive or of an indecent, obscene or menacing nature over a public electronic communications network. He received a 20-week suspended prison sentence and was ordered to pay each of his victims £100 in compensation.

Despite his crimes not being classified as sexually motivated, Woolf was given rehabilitative therapy, while Jodie and the other 14 victims were not provided with any state-funded support. 

Jodie’s trust in the police is now broken: “The police are just not set up to deal with crimes of a digital nature, and they are certainly not equipped for the volume of deepfakes that we’re going to be seeing in years to come.” 

Prof Clare McGlynn, an expert in image-based sexual violence at Durham University, says Jodie’s case shows that “criminal law doesn’t understand or deal with the harms and abuse perpetrated by technology against women and girls.”

“With constant technological development, there are always new ways to harm. We don’t even know what we’ll be talking about in five years, but I can almost guarantee it won’t be covered.” 

‘Children as young as 13 treat it as normal’

EVAW is calling for specialist victim support that would prioritise victims’ wellbeing and ensure they have support throughout a police investigation. It also highlights the value of civil laws that would allow victims to approach the courts directly for compensation and for court orders requiring the removal of non-consensual content from the internet.

“Billion-dollar tech companies have been able to hide the part they play in this abuse and evade responsibility while profiting significantly from it because there is no legal requirement or incentive for them to take action,” says Hitchen. 

McGlynn adds: “If we have a comprehensive law, we can then strengthen the hand of Ofcom to be able to say to Google: ‘You really can’t be returning these things at the top of a search’ and to X, TikTok and Instagram: ‘You cannot be advertising things like nudify apps.’”

Nudify apps are AI-powered tools that take clothed images uploaded by the user and digitally undress the people in them. In May, Google implemented a policy to remove the “promotion of synthetic sexual content” from its shopping ads. It has also committed to deranking deepfake creation sites. However, terms like “deepfake porn” and “nudify” continue to return links to apps and sites that create such content above any other results.

A spokesperson for Google was keen to push back on the idea that the company was not being proactive on this issue: “We’ve been actively developing new protections in Search to help people affected by this content, building on our existing policies. We’re continuing to decrease the visibility of involuntary synthetic pornography in Search and develop more safeguards as this space evolves.” 

Another cornerstone of EVAW’s campaign is education surrounding image-based abuse in schools. McGlynn warns that easy access to deepfake technology poses “a real threat in our schools, as young people have effectively been groomed by a culture which is not taking nudification and deepfakes seriously.”

“If children as young as 13 are seeing adverts for nudify apps, then of course they treat it as normal”, she adds. “They can type it into Google and there it is, along with YouTube providing guides on how to do it.”

Jodie is determined to see the right protections against deepfakes written into law, whether through amendments to current legislation or a stand-alone law on image-based sexual violence. She wants police to be granted more powers to act, and funding to offer proper victim support. Meanwhile, she feels powerful tech firms must be forced to take responsibility for hosting and promoting websites and apps that can cause such harm.

“It’s a crime without barriers at this point and it’s so easy; you can do it at the click of a button,” explains Jodie. “That’s why it’s so scary. These images – they live online and they will live there forever and you never know who could have them. Something has to change.”

*Names have been changed
