Ofcom vows to name and shame platforms over online sexism

Zoe Kleinman, Technology editor
The media regulator has published guidelines designed to make the internet safer for women and girls – and threatened to make it “absolutely clear to the public” which platforms are not adhering to them.
Ofcom says it hopes the measures will make it easier to report and act on online abuse, acknowledging that those processes are currently “soul destroying.”
However, the measures are recommendations rather than legal requirements, and the regulator is relying on the threat of publicly naming non-compliant platforms to compel them to act.
Critics say the regulator and the government need to go further if they want to make the online world safer.
“Until we have a legally enforced mandatory code of practice, we don’t think we’ll really see a shift in tech platforms taking this issue seriously enough,” said Andrea Simon, executive director of the End Violence Against Women Coalition.
Influencer and women’s sport advocate Demi Brown told the BBC she had been forced to “become resilient” in response to negative comments about her weight and appearance online.
She said it was wrong that she had to use the block button to remove abuse and prevent trolling on her social media accounts.
“I don’t think that we should be worried about the online space, it should be a place where we can authentically be ourselves,” she told the BBC.
‘Small steps’
Ofcom’s new guidelines announced on Tuesday include asking firms to:
- put all account privacy settings in one place
- de-monetise content containing sexual violence
- allow abusive comments to be reported collectively, not one-by-one as is currently the case
“It’s about making reporting much easier so that you can report multiple accounts that are abusing you at the same time rather than having to do them one by one, which is absolutely soul destroying,” said Ofcom boss Dame Melanie Dawes.
“It’s lots of small steps that together will help to keep people safer so that they can enjoy life online,” she added.
She insisted the threat of being called out would be a powerful one for tech firms.
“I think that the transparency that we’re going to bring to this will be a very strong incentive,” she said.
UK Technology Secretary Liz Kendall said tech firms “have the ability and the technical tools to block and delete online misogyny”.
The guidance complements previous codes, rules and guidelines issued by the watchdog as it enforces the Online Safety Act, which became law in 2023.

Sahra-Aisha Muhammad-Jones founded a running club for Muslim women in east London and said negative DMs and comments can put younger women off being online at all.
Despite having built a positive community around her, she said she still does not feel safe on the internet.
“There is the side to social media that is really harmful and really scary, and you have to be on alert all the time,” she told BBC News.
‘Some just won’t care’
Former secretary of state Baroness Nicky Morgan told BBC Radio 4’s Today programme it had been a “long battle” to see such measures established.
But she said seeing them emerge in the form of guidelines, rather than rules, for tech firms was “disappointing”.
“I think it gets some basic ground rules in place but of course, it does depend on the attitude of the tech platforms adopting the practical guidance put forward,” she said.
While some platforms may opt to do so, she said, “some just won’t care and will carry on with the deeply harmful content that we see online today”.
The concerns come amid wider criticism of the regulator for not having enough teeth.
So far Ofcom has issued only two fines for breaches of the Act.
One of the fined platforms, 4Chan, has refused to pay its £20,000 penalty and launched legal action in the US.
Walking a tightrope
Ofcom is trying to walk a tightrope between online safety and freedom of speech. It is also dealing with US-based tech giants which own the UK’s most popular social networks.
US Vice President JD Vance said earlier this year that the White House was growing tired of other countries trying to regulate American tech businesses.
Ms Kendall recently wrote to Ofcom warning it was in danger of “losing the public’s trust” if the pace of change did not pick up. Campaigners such as the Molly Rose Foundation also say the laws do not go far enough to protect people from online harm.
Chris Boardman, former pro-cyclist and chair of Sport England, complained to Ofcom in the summer about the treatment of women in sport online.
During last year’s Euro Championships, Lioness footballer Jess Carter was forced off social media because of online racial abuse.
Tennis star Katie Boulter, who received death threats following the French Open, also said abusive comments had become “the norm”.
In his letter, Mr Boardman said sexist online abuse of athletes counteracted efforts to encourage more women to take up sport.
“The action can be taken,” he told the BBC. “You’ve got AI [and] algorithms now that are ruthlessly targeting marketing to increase participation and profit.”
“We now need to use those same tools to curb the abuse in the first place, rather than having to deal with it after the fact,” he said.