Response to Ofcom’s ‘Guidance: A Safer Life Online for Women and Girls’ consultation
Summary
Refuge has responded to the consultation on Ofcom’s draft guidance aimed at protecting women and girls online, drawing on its organisational expertise, including insights from its panel of survivors and dedicated Technology-Facilitated Abuse and Economic Empowerment (TFAEE) team. While the guidance represents welcome progress towards tackling misogyny in digital spaces, particularly through the inclusion of online domestic abuse as a key harm, Refuge is calling for it to be strengthened.
The need to tackle online Violence Against Women and Girls (VAWG) could not be more urgent. Refuge’s Unsocial Spaces report (2021) found that one in three UK women had experienced online abuse, and one in six of these were abused by a partner or ex-partner. As the UK’s largest domestic abuse service provider, Refuge’s experience supporting tech abuse survivors highlights the critical need for regulation. Cases have increased both in volume and complexity, with a 207% rise in referrals to the tech team between its first year of service and 2024.
Refuge welcomes the guidance’s encouragement for tech providers to take ambitious, meaningful action to address online gender-based harms. However, to be truly effective, companies must be held to account when they fail to comply.
This is why, alongside others in the sector, Refuge is calling for Ofcom to work with the government to upgrade the guidance to a binding statutory Code of Practice.
Key strengths
Refuge would like to highlight the following strengths within the draft guidance:
- Recognition of coercive and controlling behaviour, which has been designated a priority offence under the Online Safety Act.
- A strong focus on safety by design, placing significant emphasis on preventing VAWG, as well as responding to it effectively.
- The inclusion of online domestic abuse as a ‘key harm’, marking an important step in acknowledging the specific risks women face in digital spaces.
Refuge’s recommendations
Stalking
Despite being designated a priority offence under the Online Safety Act, stalking is not adequately addressed in the draft guidance. Stalking is a tactic often used by perpetrators of domestic abuse and, last year, was one of the two most common issues reported to Refuge’s TFAEE team. Refuge urges Ofcom to include stalking in the current list of four ‘key harms’ to ensure it is appropriately prioritised.
Cooperation with law enforcement
Ofcom’s guidance should set clear expectations for how tech companies engage with law enforcement. Survivors and criminal justice agencies often face significant challenges when trying to access the digital evidence needed to support investigations. Yet tech companies hold vital data that can provide proof of abuse occurring on their platforms. These companies must be required to share such evidence promptly when legally requested.
Embedding independent VAWG expertise
Refuge recommends establishing an Ofcom Advisory Group comprising representatives from tech companies and experts in VAWG, including survivors. This group should act as an ongoing forum to share insights, discuss guidance, monitor emerging threats, and support timely, coordinated responses to online VAWG.
In addition, Ofcom should commission and support the development of specialist training for tech companies to ensure the VAWG guidance is implemented effectively.
Intersectionality
Refuge’s research shows that some groups of women are disproportionately targeted with online abuse. According to the Unsocial Spaces report, 75% of LGBTQ+ women surveyed had experienced online abuse, alongside 45% of women from ethnic minority backgrounds. These figures highlight the urgent need for tailored, inclusive responses to online harm.
The draft guidance must be strengthened to ensure tech companies understand how to respond to intersectional inequalities, including those related to disability and neurodivergence, race and ethnicity, immigration status, sexual orientation, and gender identity.
Ofcom should require platforms to embed intersectionality across platform architecture and content moderation, as well as in the design, implementation, and evaluation of safety systems.
Preventing harm
Ofcom’s guidance must strengthen its approach to encouraging ‘safety by design’. For example, Refuge recommends that features such as ‘people you might know’ or ‘suggested people to follow’ – commonly used across major social platforms – be made optional rather than default. These features carry a significant risk of exposing a survivor’s profile to their perpetrator.
On social media platforms, auto-play and autocomplete functions can inadvertently amplify misogynistic content, even when users have not specifically searched for it. Recommendation algorithms – including those used by search providers – must be safety-informed and abusability tested to prevent harmful content from being promoted.
The proliferation of audio-visual generative AI (GenAI) tools has enabled the rise of non-consensual intimate image (NCII) abuse, including through ‘nudification’ apps and deepfake technology. Refuge welcomes the proposals for search service providers to delist and deprioritise websites that facilitate the creation and sharing of non-consensual intimate content.
Refuge also strongly supports the recommendation that tech companies should scan for duplicates of all non-consensual explicit material, and ensure such content is delisted from search results.
Preventing and responding to online VAWG
Survivors must be given tools that make it easier to stay safe online. Currently, many platforms require users to report each abusive account and piece of content individually – an approach that is both time-consuming and re-traumatising. Refuge recommends that platforms develop functionalities that enable users to block and mute multiple accounts simultaneously.
Refuge also recommends that Ofcom encourage tech companies to use data such as IP addresses and mobile phone numbers to identify and prevent abusers from creating new accounts after being banned – an all-too-common tactic used to continue harassment.
In Refuge’s submission to Ofcom, a member of its Survivor Panel said: “You need to make it as burden-light as possible for the survivor by taking reports of abuse from survivors seriously and responding quickly. When we do report, we want to be regularly updated with progress, understand who has seen the report and what the next steps are. We need human moderators and contacts.”
In addition, Refuge calls for the guidance to include clearer expectations around the safe and responsible development of GenAI tools. In internal testing carried out in March and April 2025, Refuge found that some AI models produced inappropriate or insensitive responses when prompted with scenarios involving survivors seeking help, support, or empowerment. It is essential that AI systems are safety-tested, informed by survivor experiences, and trauma-aware.
Enforcement, responsibility and centring perpetrator accountability
A member of Refuge’s Survivor Panel highlighted the frustration that survivors often bear the burden of taking action or effecting change: “It is really frustrating that it boils down to the person who is being abused to take action or change. It has always been down to me. The focus should be on the perpetrator. There should be concrete consequences to their abusive actions, particularly their freedom to use tech.”
The guidance must prioritise stopping perpetrators and making them accountable for their actions, rather than placing the onus solely on survivors to protect themselves.
Refuge calls for clear standards requiring tech companies to hold perpetrators – especially repeat offenders – to account, with clear consequences for abuse. While the implementation of ‘timeout features’ for users who repeatedly perpetrate online gender-based harms is welcome, the guidance should also recommend stronger deterrents for repeated offences, such as platform bans.
Refuge welcomes proposals for companies to take ‘appropriate’ action when online gender-based harm occurs but emphasises that such action must be timely, as speed is essential to mitigating harm to survivors.
Ofcom’s guidance should further underscore that tech companies must remove harmful VAWG content while investigations are ongoing. In cases of reported domestic abuse, offending accounts should be suspended pending investigation. Additionally, tech companies must be encouraged to proactively identify and take down VAWG content.
Data sharing
Ofcom should require tech companies to collect and share data with the regulator on the perpetration of online VAWG occurring on their platforms. Improved data collection will support enforcement and mitigation strategies and enhance awareness across the tech sector about how their platforms are being exploited to perpetrate harm.