The UK’s new deepfake laws are already making the internet safer for women, but there’s still more to do

The most notorious deepfake porn website is now blocked to UK users.

Change is possible. Sometimes, it doesn’t feel like it is, particularly when working on violence against women and girls. But last week, the largest and most notorious deepfake sexual abuse website was no longer accessible in the UK.

The site blocked access following the government's announcement of plans to criminalise creating as well as sharing sexually explicit deepfakes without consent. Then, the nudification app, at the centre of shocking cases where teenage girls’ social media posts were turned into nudes by their male classmates, similarly restricted access.

This is a seismic moment in the fight against deepfake sexual abuse. It ends the easy access and normalisation of these abusive and harmful websites and apps, sending a clear message that using these services is deeply problematic.

But make no mistake: the decision to block access is a political act, the first step in what will no doubt be a long fight. The owners of these websites and apps are watching, waiting to see whether the UK government will follow through on its promises and whether other countries will follow suit.

While this is a welcome development, there’s a long way to go. We must now double down on our efforts to end this abuse.

The first concrete step is to ensure the new law is comprehensive, covering anyone who deliberately makes a sexually explicit deepfake without consent. Unfortunately, the government’s current proposal is limited, requiring proof the perpetrator is motivated by sexual gratification or causing distress. This won’t include men claiming artistic motives or saying they’re only having a laugh. Also, even if the perpetrators are trying to cause distress, actually proving it is difficult.

We should name this abuse for what it is – creating a digital forgery. It’s stealing someone’s likeness and sexual identity, creating a false representation of them. It’s non-consensual conduct: neither the porn actor nor the woman whose image is superimposed into the porn has consented to their images and identities being used in this way, and we don’t allow such false claims in other walks of life.

The current proposal will also create unjustifiable differences. If someone takes an image of you sleeping nude without your consent, it’ll be an offence, whatever their motives. But, if they take an image of you clothed, then use AI to make it nude, it will only be an offence if there is actual proof their purpose was causing distress or sexual gratification. Most women will recognise both acts as being intrusive and violating, regardless of the motives, and won’t accept them being treated differently. As the Revenge Porn Helpline says, motive requirements will make any new law difficult to evidence, charge and prosecute.

A spokesperson for the Ministry of Justice told GLAMOUR, “We are proud to be a world leader in cracking down on sexually explicit deepfakes and our new law will mean anyone who produces one for their own sexual gratification or to cause alarm, humiliation or distress will face a criminal record and an unlimited fine. Since we announced the new law two of the biggest deepfake sites have already blocked access to UK users in response to our plans.”

“Proving intent is a tried and tested part of everyday working across the criminal justice system – from police to prosecutors and the courts. Perpetrators will not be able to ‘get away’ with this degrading and misogynistic crime by denying their intentions.”

However, this new law is also about far more than holding individual perpetrators accountable. Dismantling the deepfake sexual abuse ecosystem has always been my primary justification for criminalisation.

A comprehensive criminal law makes clear that deepfake sexual abuse websites and apps have no lawful purpose, empowering us to impose greater obligations on the internet services that are facilitating and encouraging deepfake sexual abuse. That means challenging the payment providers that continue to prop up the deepfake financial ecosystem; saying to Google and Bing they can no longer highly rank deepfake porn sites and apps; making YouTube remove the videos telling people how to create sexually explicit deepfakes; removing adverts for nudify apps on mainstream social media such as X (formerly Twitter).

Taking this stand against internet platforms requires a comprehensive criminal law. We must not give these internet services a get-out clause by introducing a weak, partial law, enabling them to say they don’t have to act, as some forms of non-consensual creation remain lawful.

But it also requires the UK’s online safety regulator, Ofcom, to step up and prioritise tackling deepfake sexual abuse. We need real political pressure to encourage Ofcom to take on this mantle: so far, the violence against women sector has expressed deep concerns that Ofcom’s approach is weak and will do little to disrupt existing patterns of online abuse against women and girls.

An Ofcom spokesperson told GLAMOUR, “Illegal deepfake material is deeply disturbing and damaging, and tackling violence against women and girls online is a priority for us. Under the Online Safety Act, firms will have to assess the risk of content like this circulating on their services, take steps to stop it appearing and act quickly to remove it when they become aware. Although the rules aren’t yet in force, we are encouraging companies to implement these measures and protect their users now.”

This is a moment of reckoning. The progress we’ve made is fragile, reliant on the whims of shady internet platforms. But we can harness the momentum. And we owe that to teenage girls and women everywhere who are now living with the pervasive threat of deepfake sexual abuse.

Creating sexually explicit deepfakes is not just like a sexual fantasy in someone’s head – it’s creating a digital file that can be shared online at any moment. Women know that at any time, anyone can make sexually explicit deepfakes of us without our consent, and there’s little we can do about it.

But we can change that. We can use our collective power to say this ends now.