
5 February 2024

New laws criminalise the sharing of intimate deepfakes without consent

Recent reports of sexually explicit ‘deepfake’ images of Taylor Swift circulating in the news last week have highlighted the reality that women and children are disproportionately targeted by generative AI. We must remember that behind this technology is often an abuser intending to cause harm and distress to their victims.

From 31st January 2024, the Online Safety Act has made the sharing of AI-generated intimate images without consent illegal. The Act has also brought in further changes around sharing, and threatening to share, intimate images without consent. Read on below to find out more about the changes the Act has made and what your rights now are.

What are ‘deepfakes’?

‘Deepfakes’ is the term given to any content that has been digitally manipulated to falsely depict an individual using AI technology¹. It can include altering audio to imply an individual has said something they haven’t, and altering videos and images to show someone doing something they haven’t.

While there has been lots of public debate around fears of AI-generated images manipulating politics and altering public opinion, sadly the most commonly shared deepfake images on the internet are non-consensual sexual depictions of women. Some of these images are being created and shared by men known to the survivors of this crime.

The issue hit the headlines last week as deepfake explicit images of Taylor Swift went viral on social media and were seen millions of times before being pulled by social media platforms. The widespread horror and outrage at this high-profile case shone a much-needed light on this alarming and growing trend.

Even when the images themselves are not real, the harm they can cause is. Not only can they have a devastating effect on survivors’ mental wellbeing and physical safety, but they can also be used to blackmail, humiliate, damage relationships, and create employability issues.

With the rise of generative AI, it has become easier than ever to create images and videos of women and girls with the intent to blackmail and abuse, underscoring the importance of firm action from legislators and social media platforms.

What are your rights?

Sharing intimate images of someone without their consent and with intent to cause distress has been illegal since 2015², and since 2021 it has also been illegal to threaten to share such images, thanks to Refuge’s Naked Threat campaign.

As a result of sustained campaigning by Refuge and our allies, the new Online Safety Act goes even further in enhancing protections for women in online spaces, with new rules meaning that sharing intimate AI-generated images of someone without consent is now also illegal.

In addition, if an intimate image has intentionally been shared without consent, you no longer need to prove the perpetrator’s motivation to cause distress. For threats to share, it instead needs to be shown that the perpetrator intended the survivor to fear the threat would be carried out.

There are also more serious consequences for sharing intimate images based on the perpetrator’s intent, whether that is to alarm, distress or humiliate the person pictured, or obtain sexual gratification.

Sending unsolicited sexual images, also known as ‘cyberflashing’, has been criminalised, although this depends on proof of the perpetrator’s intent to cause harm or gain sexual gratification, which can be difficult to prove.

You can read more about the recent changes here.

What does this mean for me? 

It means that, as of 31st January 2024, committing any of the above is a prosecutable offence.

If you have been affected by deep fake technology or technology-facilitated abuse, you can seek support and contact the National Domestic Abuse Helpline on 0808 2000 247. To find resources on how to secure your tech visit

What’s next?

Since 2020, we’ve been campaigning for social media platforms to take greater responsibility for acting on harmful content like this being shared. We need action from social media platforms, and this is why regulation is key.

At Refuge, we will be feeding into Ofcom’s implementation of the Online Safety Act and making recommendations to ensure the voices of survivors are heard, and that there are processes in place to safeguard survivors online.

We hope these changes can be an additional step in helping survivors of intimate image abuse seek justice.


¹Lucas, K. T. (2022) Deepfakes and Domestic Violence: Perpetrating Intimate Partner Abuse Using Video Technology, Victims & Offenders, 17:5, 647-659. Available here.

² Criminal Justice and Courts Act 2015 (c. 2). Available here.