Refuge welcomes Government action to tackle deepfake abuse but warns that more must be done to protect survivors

Responding to the Government’s announcements on deepfake intimate image abuse, Emma Pickering, Head of Technology-Facilitated Abuse and Economic Empowerment at Refuge, said:  

“As the UK’s largest specialist domestic abuse charity with a dedicated Tech-Facilitated Abuse and Economic Empowerment team, Refuge sees firsthand the devastating impact that intimate image abuse has on survivors. AI-generated intimate image abuse is escalating at an unprecedented rate, with recent reports of a surge in non-consensually created images on X illustrating the alarming pervasiveness of this kind of abuse in online spaces.  

Refuge welcomes the news that the Government will this week bring into force the offence of creating, or requesting the creation of, non-consensual deepfake intimate images, ensuring that survivors are protected without delay. The decision to make this a priority offence under the Online Safety Act rightly signifies the seriousness of this abuse and means that tech companies must take proactive steps to prevent it from occurring on their platforms. However, we know that conviction rates for all intimate image abuse offences are woefully low, and for the new law to have teeth it must be backed by trauma-informed training for police and prosecutors to ensure perpetrators can be effectively brought to justice.

The planned ban on ‘nudification’ apps, to be enacted through the Crime and Policing Bill, is another welcome step towards targeting the problem at its root, making it illegal for tech companies to provide the tools perpetrators use to create non-consensual AI images. But the devastating reality is that survivors cannot afford to wait. Tech platforms are continuing to prioritise profit over the protection of women and girls, and we need to see a clear timeline for the delivery of this ban to ensure that it is put into practice quickly and effectively.

Ofcom’s much-needed investigation into whether X has complied with its duties under the Online Safety Act must be swift and decisive – any delay will leave women at risk of further abuse, while the platform continues to profit from their objectification and degradation. How far the regulator goes in holding it to account will send a clear message to women and girls about how effectively the online safety regime can protect them from online harms.   

While the Government’s announcements this week are a welcome step forward, to achieve its goal of halving violence against women and girls (VAWG) by 2034, it must go further to tackle the growing threat of tech abuse. Ofcom’s guidance on online VAWG is currently voluntary – but this is not enough. Refuge is calling on the Government to upgrade it to a mandatory code of practice, ensuring tech companies are required to comply or face enforcement action. If online harms are not treated with the urgency they demand, women and girls will continue to pay the price.”

If you have been affected by intimate image abuse, including AI-generated images, you are not alone. There are some things you can do: 

If you are worried someone might be monitoring your devices, please take action from a safe device. Learn more about keeping your technology safe here.