Refuge responds to government’s proposal to tackle intimate image abuse
Responding to the government’s new proposed offences to tackle intimate image abuse, Emma Pickering, Head of Technology-Facilitated Abuse and Economic Empowerment at Refuge, said:
Refuge has campaigned extensively for intimate image abuse to be properly recognised by the justice system, and we welcome the government’s long-awaited measures to tackle the creation of intimate images without consent through new legislation.
Intimate image abuse is a deeply violating form of tech abuse which has a devastating impact on survivors, regardless of whether the images are real or not. Survivors of this kind of abuse are often left feeling socially isolated, with long-term negative consequences for their mental health and wellbeing.
We are pleased that the government will make the creation of sexually explicit ‘deepfakes’ a criminal offence, but we urgently need clarification on the specifics to ensure that survivors will be properly protected. We urge the government to take a consent-based approach, so that an offence is determined not by the perpetrator’s intention but by whether or not the victim consented to the image being made. Similarly, whilst we welcome the introduction of new offences to criminalise the taking or recording of intimate images without consent, we need to see further details, including how a perpetrator’s intention will be assessed where this is relevant.
It is of paramount importance that these offences are accompanied by measures to improve enforcement. Although the sharing of intimate images has been illegal since 2015, and, following the success of Refuge’s ‘The Naked Threat’ campaign, threatening to share intimate images has been a crime since 2021, perpetrators are rarely held to account. Data published by Refuge in 2023 showed that charging rates for intimate image abuse remain woefully low, with only 4% of cases reported to the police resulting in perpetrators being charged.
As a result of a huge and unacceptable gap in the law, non-consensual images often remain on perpetrators’ devices, even after a conviction. This is incredibly distressing for survivors, and it is crucial that any new legislation includes provisions to compel perpetrators to delete any non-consensual images.
We urgently need trauma-informed training for all sections of the justice system, from the police and CPS to the judiciary. This must include training on the intersection of tech-facilitated abuse and coercive control. We know that many non-consensual explicit images are created by men known to the survivor, and the link with domestic abuse cannot be overlooked.
Along with others in the sector, we urge the government not to wait until spring to introduce these measures. Delays put survivors at risk, and it is essential that survivors of all forms of intimate image abuse can access the justice they deserve as swiftly as possible.