News | Press Release

1 February 2024

Refuge response to Online Safety Act law changes around AI-generated intimate image sharing

Emma Pickering, Head of Technology-Facilitated Abuse and Economic Empowerment at Refuge, said:

“Refuge welcomes the law changes around AI-generated images that have come into effect as part of the Online Safety Act, which make it illegal both to share and to threaten to share intimate AI-generated images without consent. Refuge campaigned for more than two years with colleagues in the sector to shape the Online Safety Act into legislation that will better protect women and girls from misogynistic online harms.

Whilst we are closely following developments as the Online Safety Act is rolled out, especially in how the regulator Ofcom will oversee protections for women and girls, we would like to acknowledge this vital first step in legislation recognising the huge threat AI presents to women and girls when it is used as a weapon by perpetrators to abuse survivors.

The rise of generative AI means it has become easier than ever to create fake images of women and girls as a tool for abuse. While there has been much public debate around fears of AI-generated images, or ‘deepfakes’, manipulating politics and altering public opinion, the most commonly shared deepfake images on the internet are non-consensual sexual depictions of women.

Intimate image abuse has a profound and devastating impact on survivors, whether the images are real or not. This form of abuse can harm a person’s physical safety, sexual autonomy and wellbeing by causing psychological trauma, feelings of humiliation, fear, embarrassment and shame. The sharing of intimate image deepfakes can also damage a person’s reputation, impact their relationships, and affect their employability. This is especially true as increasingly sophisticated technology has made it ever harder to distinguish real images from fake ones. We know coercive control is embedded in this form of abuse, and its links to domestic abuse cannot be overlooked. Many of these images are being created and shared by men known to the victim/survivors of this crime.

While the non-consensual sharing of intimate images has been illegal since 2015, and threatening to share intimate images has been a crime since 2021 following the success of Refuge’s ‘The Naked Threat’ campaign, perpetrators are rarely held to account. Police are not responding to incidents of intimate image abuse in a way that reflects the serious nature of these crimes. It is our hope at Refuge that the criminalisation of intimate image ‘deepfakes’ and reform of wider intimate image abuse law will help survivors of this insidious form of abuse get the justice they deserve when reporting these horrendous cases.”