By Jess Eagelton, Policy and Public Affairs Manager
As Parliament returns from the summer recess, attention will return to the Online Safety Bill. This piece of legislation is reaching the final stages of its Parliamentary journey and is likely to soon become law. Refuge has been campaigning on the Bill, and against online abuse of women and girls, for several years. Thousands of our wonderful supporters have taken action to tell decision-makers to prioritise women and girls’ safety online. Technology-facilitated domestic abuse is a growing threat, and over a third of issues reported to our specialist technology-facilitated abuse team relate to social media. So, now that this complex Bill is almost law, how will it change women and girls’ experiences online?
What does the Bill do?
Regulating social media
The Bill introduces regulation of online spaces such as social media sites like Facebook and search engines like Google. New duties will be placed on tech companies like these to ensure their users are safe. Social media platforms will need to:
- Take down illegal content from their sites.
- Prevent and remove ‘priority’ illegal content – the government has listed different types of criminal offence that are considered ‘priorities,’ including some domestic abuse-related offences such as stalking, harassment, coercive control and intimate image abuse.
- Operate accessible, easy-to-use reporting and complaints systems.
- Enforce minimum age requirements for users and prevent children from accessing age-inappropriate content such as pornography.
There are also additional rules for larger and higher-risk platforms (likely to include sites like Facebook, Instagram and Twitter), which will need to:
- Set clear terms of service which outline their restrictions on legal content. Platforms must then remove content banned by their own terms and conditions.
- Provide tools to empower users to tailor the type of content they want to see, for example if they want to avoid seeing potentially harmful content on their feed.
Ofcom, the UK’s communications regulator, will oversee these new regulations and will be given powers to investigate and fine tech companies for breaching the rules. Senior managers at tech companies can also be held criminally liable if their company breaches certain rules.

These new rules won’t come into force immediately – there will be an implementation period whilst Ofcom develops guidance for platforms on how they can meet their new duties.
New criminal offences
The Bill will also introduce some new criminal offences for online abuse and will reform the current law around intimate image abuse. Refuge has long campaigned for better protections for survivors of intimate image abuse, and our research has shown that 1 in 14 adults in England and Wales have experienced threats to share intimate images or videos. In 2020-21, we successfully campaigned for threats to share intimate images to be made a crime (where there is intent to cause distress). The Bill will change the criminal law to:
- Create a new base offence of intentionally sharing an intimate image without consent. This removes the need to prove the perpetrator’s motivation to cause distress.
- Create two further, more serious offences of sharing an intimate image without consent, based on intent to cause ‘alarm, distress or humiliation’ or to obtain ‘sexual gratification.’
- Create a specific offence of threatening to share intimate images.
- Criminalise the sharing of ‘deepfakes’ – explicit images or videos which have been digitally altered to look like someone.
- Make cyberflashing illegal.
New offences of sending threatening communications and knowingly sending false communications to cause harm will also be introduced, to update the criminal law previously governing malicious communications. Intimate image abuse survivors will also have greater protections – including lifetime anonymity and automatic eligibility for special measures in court, such as giving evidence via video or behind a screen at trial.
When the Bill was first introduced to Parliament in 2022, there was not one mention of women or girls in its 225 pages. After campaigning for over a year, and working with a broad coalition of organisations including the NSPCC and EVAW, we were pleased that the government made two changes to the Bill.
Firstly, coercive control – a common form of domestic abuse – will be listed as a ‘priority’ offence in the Bill. This means tech companies need to both prevent and remove coercive control content from their sites.
Secondly, Ofcom will have to publish guidance for tech companies about protecting women and girls. Our aim was for a Violence Against Women and Girls Code of Practice – a strong, enforceable tool to hold social media platforms to account for women and girls’ safety. While the guidance falls short of that, we hope this amendment will make a real difference for survivors and send a strong message to tech giants that violence against women and girls is a serious crime, offline and online.
In addition, Ofcom will need to consult with the Domestic Abuse Commissioner and Victims’ Commissioner when drafting its guidance for tech companies.
What does the Bill mean for women and girls?
Once the new regulations come into effect, our hope is that social media companies will pay greater attention to how perpetrators are using their platforms to abuse women and girls. For too long, survivors have told us they felt ignored and belittled by these companies. Our research has shown that almost all survivors of tech-facilitated domestic abuse (95%) said they were not satisfied with the support they received when they reported domestic abuse content to a social media company. Over half did not even receive a response to their report, and many were left waiting weeks, months or even years after requesting content be removed.
The Bill is an important step forward: Silicon Valley bosses will need to start making improvements to their platforms to make them safer for users. This should include making reporting systems easier to use. The biggest companies will also need to take down content which breaches their own terms and conditions – we know companies sometimes claim that reported content does not breach their community standards, even when it is clearly abusive, harassing or intimidating.
However, the Bill does not go as far in protecting women and girls as we want it to. It is disappointing that the new guidance on women and girls will not be as strongly enforceable as a Code of Practice would have been. Ofcom and the government need to ensure tech bosses are held to account if they aren’t making the changes needed to protect women and girls online.
Whilst we welcome the criminalisation of cyberflashing, the new offence requires proof of the perpetrator’s motive – to cause the survivor harm, or to obtain sexual gratification. Intent and motivation can be hard to prove, and this may create a worrying loophole in the new law.
The reform of intimate image abuse law is also welcome. However, this needs to come with robust training in tech-facilitated domestic abuse to ensure police officers are confident in identifying intimate image offences and gathering evidence. Earlier this year we found that despite a steady increase in the number of intimate image offences recorded from year to year, the charging rate remains low at just 4% of all offences recorded.
As new technology continues to evolve, we are urging government to commit to addressing online violence against women and girls and to ensure that specialist support services are funded to support all survivors.