Inside The Horrific World Of Deepfake Porn


A young woman has told how her whole world collapsed after her best friend stole her images and shared a deepfake porn video where she was the main character.

Jodie, not her real name, was left crying on the floor of her room after she received a tip-off that explicit images of her were being used on a porn site.

The poster - who was later revealed as her BBC music prodigy pal - said these images made him 'so horny' and offered to supply more photos of Jodie if others would create more deepfakes.

The vile attack is one of a rising tide of devastating online sexual violence, with experts saying that AI software is leaving young women defenceless.

Now specialists warn that shock figures uncovered by MailOnline revealing the extent of deepfake porn across the UK are just 'the tip of the iceberg' - and many others will unknowingly be victims of the foul practice. 

Children as young as 14 were found to have been targeted by the sick violation, as reports show 99.9 per cent of victims are women.  

And a private members bill has been introduced in the House of Lords to criminalise the creation and solicitation of the 'degrading and depraved' AI images.

Jodie battled to get the police to take the perverted violation seriously; they have since apologised for how they handled her case.

'There was a total lack of compassion and consideration and care throughout the whole process,' Jodie said.

'I wanted a restraining order but they said I'm not the only job they have to do. It was such a horrific experience, it has led to real trauma.' 




This is the picture that helped Jodie solve who was creating deepfake porn of her - one she had taken with her best friend and not posted anywhere

Jodie tracked down who made the deepfake porn by analysing the photos the AI had used - one was a shot never shared on social media, taken with her best friend Alex Woolf.

'My whole world was crushing down on me when I realised who had done it,' Jodie said.

Woolf, who had a double first from the University of Cambridge, was someone she 'loved and trusted, he knew everything about me, our families were intertwined'.

A BBC Young Composer of the Year winner, he had even played the piano at the funeral of Jodie's grandmother, who had 'joked that he was my future husband'.

She said: 'You think people who do this are freaks and nerds but they're just people we know and live with and work with.' 

Jodie had previously confronted him over 'creepy' Reddit posts on an anonymous account, but Woolf just replied: 'You wish I was that obsessed with you'.

In 2021, he admitted to 15 charges of sending messages that were grossly offensive or of an indecent, obscene or menacing nature over a public electronic communications network.

'I broke down crying when the police called me, it broke my heart,' she said.

'Part of me really wished that it was all a big misunderstanding and there was another reason.





Woolf was described as a 'teddy bear' and 'everything you could want in a guy friend', often posting about women's rights on social media 






'It tore my family apart, our brothers were best friends as well.' 

Her story comes as MailOnline can reveal that UK police forces have logged 81 reports of deepfake porn in the last two years, involving victims as young as 14 years old.

And the number is far below the reality - as some police forces did not hold the data and others refused to give accurate figures, including the Met. 

Professor Clare McGlynn, who specialises in the legal regulation of pornography, sexual violence and online abuse, said the police reports 'represent the tip of the iceberg'. 

She says that thousands of people use the websites every day.

Conservative peer Baroness Owen, who is leading the charge with the Data (Use and Access) Bill, says police reports show 'there are huge numbers of women having their consent violated without their knowledge'.

Jodie said this figure 'is not a sign that this abuse isn't happening, it's a sign of how broken the system is'.

Without sufficient laws to protect women online, 'victims are left in limbo, while those responsible face no meaningful consequences' and the 'true scale of the problem is hidden'. 




The musical prodigy had won BBC Young Composer of the Year in 2012 and frequently composed and performed his own music

If people don't know that it is illegal, many will not bother going to the police, and those who do will often be told that there's nothing the cops can do without a suspect - 'that's exactly what happened to me,' Jodie said. 

When she first went to City of London police, a middle-aged, balding cop took no notes and she was later told by an officer that no crime had been committed.

If she wanted the videos removed, she would have to ask the porn website herself, they insisted.

'All I could think was - is he delusional to think that I want to go back and look at each photo of me like that and trawl through the website?', Jodie said. 

Detective Chief Superintendent Mandy Horsburgh, Head of Specialist Operations at City of London Police, has since apologised for 'any distress to the victim in our initial assessment and response' when contacted by MailOnline.

She added: 'We have reviewed our procedures since this incident in 2021 and cases, such as this, are allocated to specialist officers for assessment and investigation where appropriate. We will take every opportunity to learn from this and will conduct a review of the circumstances of this case and make any further improvements where necessary.'

Jodie later tried again with the Metropolitan Police, this time specifically asking for a female police officer, to whom she gave a file of evidence spanning several years to review.

She said: 'It took a lot of strength, it was really traumatising having to go back'.

Of the 20 women whose images were posted on the site, Jodie either knew or tracked down 18 - the final two still have no idea their images have been shared.

After the initial interview, Jodie said every phone call from the police came from a withheld number and often at random times, including late at night.




Woolf attended an awards ceremony in July 2021 - a month later he was sentenced to a 20-week prison sentence, suspended for two years






She was told she couldn't have a liaison officer because they believed there was no risk of threat or harm, so when she saw a missed call she had no one to contact and could only hope to catch them the next time they rang.

She felt her case was not handled with the sensitivity it would have required had it been treated as a sexual offence.

The Metropolitan Police admitted in response to these claims that they 'haven't always got it right' and will 'continue to learn from our past mistakes and ensure victims are at the forefront of everything we do'. 

The failure of the police to recognise the horror of this sex crime was mirrored by the law, which had nothing to pin Woolf down with other than a misuse of a communications device.  

After insisting she needed to share her experience, Jodie cried when she read her impact statement to the court about how much this had destroyed her. 

So did her parents, as well as his.

'I'm convinced the magistrates were tipped by the emotion because without it I don't think they would have understood how awful it was'.

Jodie said Woolf showed no emotion. He has since told the BBC he is 'utterly ashamed' of his behaviour and he is 'deeply sorry' for his actions.

'I think about the suffering I caused every day, and have no doubt that I will continue to do so for the rest of my life,' he added.




UK police forces hold 81 reports of deepfake porn in the last two years involving victims as young as 14 years old, despite websites having millions of users

'There are no excuses for what I did, nor can I adequately explain why I acted on these impulses so despicably at that time.'

He denied any involvement in the harassment Jodie faced before this. 

Woolf was given a 20-week prison sentence, suspended for two years and ordered to pay each of his victims £100 in compensation, as well as £85 in court costs and a £128 victim surcharge.

With the money, Jodie bought an oil painting of a naked lady - 'it felt like I was taking my power back,' she explained.

Woolf was also ordered to complete a 40-day rehabilitation programme and 40 sessions with a sex offender programme as well as 180 hours of unpaid work.

But because deepfakes were not a crime at the time, Woolf is not on the sex offenders list and Jodie believes he has been able to continue working with his music and teaching children privately.  

Varinder Hayre, district crown prosecutor at the CPS, said Woolf's behaviour was 'severely depraved and reprehensible' with a 'drastic impact on his victims'.

While Jodie was lucky in the sense that she discovered the photos and was able to uncover the perpetrator, the majority of victims will never know that deepfake porn of them exists.











Geordie Shore star Vicky Pattison discovered deepfake porn of herself online while researching her documentary Vicky Pattison: My Deepfake Sex Tape







Charities have reported that hundreds of thousands of 'nudify' images are being made every month, yet Freedom of Information requests made by MailOnline show that at least 81 reports of deepfake porn were made to the police in the last two years.

Experts say most women are not even aware they are victims and others may have issues around reporting to police. Some police forces also did not hold the data and others refused to give accurate figures, including the Met.

The forces that did reply painted a bleak picture, suggesting most victims do not know that perverted images have been made of them.

South Wales experienced the worst wave of deepfake porn abuse, with Gwent Police (Heddlu Gwent) revealing 28 reports, the highest number of any police force in Britain.

In 21 of these instances, women were the victims. The ages of offenders ranged from 12 to 50, with the youngest victim aged 14.






There were a variety of outcomes to these reports including charges, evidential difficulties, and no suspect identified. 

In West Yorkshire, there were 17 reports, five of which were made about children under the age of 15. In five cases, a suspect was identified.

In Lancashire, there were seven reports and four in Leicestershire.

In Scotland, the guardian of a 15-year-old girl reported that a digitally altered image of the child was on another person's phone and had been shown to others.

In another instance, a guardian of a 21-year-old woman said they were aware of online content featuring her image which had been digitally altered.

In the final report made to Scottish police during the last two years, a man recognised digitally altered images online of a woman known to him.

In Surrey, there were four reports in the last two years relating to the taking, making or distribution of indecent photographs or pseudo-photographs of children, malicious communications and harassment.

None led to charges.

Despite these disconcertingly low figures, last year it was revealed that at least 50 Telegram bots had been built to 'remove clothes' or create sexual scenarios from supplied images, with more than four million 'monthly users' combined.








Telegram hosted nudify bots with a combined four million 'monthly users', although these are believed to represent only a portion of what is available on the app






The reported bots were English-language only, and so were believed to represent just a small portion of the deepfake scene on Telegram.

The year before that, a popular deepfake porn website hosted on Google had 17 million hits a month and charged users $5 to download nude videos of celebrities.

And the Revenge Porn Helpline has reported a 400 per cent increase in deepfake abuse cases reported to it since 2017.

Professor Clare McGlynn said the police reports 'represent the tip of the iceberg'.

She said: 'We know that thousands are using nudify apps and visiting deepfake websites every day.

'Many women do not know they have been deepfaked, and if they do, many do not want to report to the police as they fear not being taken seriously.

'They may not know who the perpetrator is. They perhaps also fear what the perpetrator will do if they report.'








Professor Clare McGlynn said that the number of reports made to the police are just the 'tip of the iceberg'

Professor McGlynn, who has also been closely involved in drafting new legislation, added that these reports nonetheless show 'that for some women and girls, when they find out, they do want to seek justice through the criminal justice system.'

Sophie Compton, director of Another Body and co-founder of #MyImageMyChoice, similarly said these reports were a 'fraction' of the abuse that is taking place.

As well as the 600,000 photos of women processed by one 'nudify' app they monitored in the first three weeks of its launch, they said there were 'thriving forums and online marketplaces targeting British women and celebrities'.

'This is why alongside adequate criminal legislation that sends a clear message that this practice is not acceptable, we urge the government to ensure we get regulation that enables us to shut down sites dedicated to this practice.'

The director of End Violence Against Women, Andrea Simon, added that 'the prevalence of this abuse is hard to measure'.

She explained that 'most victims are unaware of the abuse, may feel worry or shame about coming forward or unsure of where or how to report it, or may be dismissed by the police when they do report'.

For those who do discover these images, she said it can have 'a traumatic and long-lasting impact on victims, their mental health, careers and their relationships.'











A young woman shared on her Facebook how 'AI has gotten so so scary' when she was sent a deepfake pornographic image of herself that was 'extremely graphic and has very much upset me and made me feel violated'






 'There are very few protections in place to prevent this abuse and hold perpetrators accountable.'

A report in 2019 showed that 96 per cent of all deepfakes found online are non-consensual sexually explicit images, and 99.9 per cent of those depict women.

Ms Simon believed that the threat of deepfake abuse forces women to 'self-censor' and has a 'chilling effect on our freedom of expression. Women and girls have the right to be safe online.'

'I hate to think where else my pictures have ended up on the internet,' Jodie shared. 'If you're a normal person why would you have any idea?' 

But Jodie believes these deepfakes are just 'scratching the surface of what technology can do. That's why I want a robust law to protect us.' 

Lady Owen, who became the youngest recipient of a life peerage in 2023, said: 'It is vital that the Government takes a strong position in standing up to those who abuse women in this appalling way'. 








Lady Owen shared that she was 'delighted' with the House voting for the amendment, adding that 'a woman's consent should be the only requirement' 

The House recently voted to increase the maximum penalty for deepfake abuse from an unlimited fine to imprisonment in the third reading of the Data (Use and Access) Bill.

Jail time would deter abusers from thinking they are 'untouchable' and would 'show how seriously, as a society, we take this form of digital violence against women', Lady Owen believed.

As well as increasing the sentence, peers overwhelmingly supported calls to remove the 'reasonable excuse' defence.

The controversial provision would have required the victim to prove the perpetrator had intended to cause them harm, distress or humiliation, or to gain sexual gratification.

Campaigners said that this would have protected abusers' rights to freedom of expression and failed to properly protect women from their image being used against them. 

Lady Owen shared that she was 'delighted' with the House voting for the change, adding that 'a woman's consent should be the only requirement'.

The Government were forced to accept another amendment that would criminalise the solicitation of sexually explicit images of someone without their consent, blocking a loophole where people in the UK could ask someone abroad to make a deepfake image. 

The Data (Use and Access) Bill is set to head to the House of Commons for consideration from MPs.








'We are at the precipice of a new age of extreme misogyny', Baroness Owen said 






But experts have argued that although a law is crucial to protect women, responsibility also lies with Big Tech. 

Ms Simon said these corporations will 'only act if strong measures are put in place requiring them to address and prevent it' and the government needed to 'ensure tech companies are held accountable'.

Professor McGlynn added that nudify apps are still easily accessible online and Google has 'facilitated this by highly ranking search terms relating to nudify and deepfaking'.

She also said that 'Instagram continues to profit from deepfake sexual abuse by advertising nudify apps'. 

The National Police Chiefs' Council acknowledged that deepfakes 'disproportionately affect women and girls' and are 'deeply pervasive and can have a traumatic impact on victims'.

They wanted victims to know that these are 'serious crimes' and that they should feel 'confident' reporting them to the police.

They admitted that 'tech-enabled crimes continue to evolve at pace, and policing alone cannot keep people safe online'. 

The council said that companies should be 'held to account for the role they should play in removing harmful content from their platforms'.

Greater education on consent and healthy relationships was vital 'to stop this harmful behaviour from developing in the first place', they added.
