Charities’ dismay as Twitter disbands safety group

Sign on the exterior of Twitter headquarters in San Francisco, California (Getty Images)

Twitter has disbanded the volunteer group which advised it on self-harm, child abuse and hate speech.

The Trust and Safety Council, formed in 2016, comprised about 100 independent groups, including Samaritans and the UK Safer Internet Centre (UKSIC).

Elon Musk’s Twitter had been due to meet the group on Monday, but instead disbanded it via email.

A UKSIC spokesperson told the BBC the charities were dismayed by the “sudden end” to the Council.

“We will continue to offer our many years of experienced assistance to those who want it and will continue to ensure we do what is necessary to protect internet users from harm,” they said.

The BBC also spoke to Glitch UK, a charity seeking to end online abuse, with a particular focus on black women and marginalised people.

Its spokesperson said the Council was disbanded via email, hours before a planned meeting was due to take place: “That is a really worrying trend when you’re thinking about trust and safety, work that demands a huge amount of expertise or experience.”

They said their work at Glitch UK had shown that abuse and misinformation could be prevented, or at least mitigated, with the right investment and collaboration with experts.

“But we’re not seeing that from Musk’s leadership,” they said.

“His self-professed commitment to free speech has actually led to the censoring, particularly of those people who are marginalised, who are targets of disproportionate abuse like black women, who are now fearful of speaking out on the platform because of the higher levels of abuse.”

Three resignations

On 8 December, three members of the Council resigned, citing concerns over Mr Musk’s leadership.

At the time Anne Collier, who had been a member since the group was formed, said Twitter was heading in the wrong direction.

“Having followed the research on youth online risk since 1999, I know how hard it is for platforms to get it right,” she said.

“But some progress has been made in the industry. Tragically, the research shows that Twitter is going in the opposite direction, and I can no longer find a reason to stay in tacit support of what Twitter has become.”


Analysis by Marianna Spring, disinformation and social media reporter

Since buying Twitter, Elon Musk has been leaving somewhat obvious clues about the new approach he wants to follow at the social media site.

These include welcoming back previously suspended accounts, sacking employees, scrapping the site’s Covid-19 misinformation policies, and releasing the so-called “Twitter Files”, which purport to show the site was unfairly censoring certain accounts.

There are Musk’s tweets, too, railing against the “woke virus” and taking aim at the departing Chief Medical Adviser to the US president, comments that have fanned the flames of abusive conspiracy movements of which the adviser was already a target.

Abolishing this Trust and Safety Council is a further indication of Twitter’s new approach to hate speech and content that could harm vulnerable people.

But it’s worth remembering why it was established in the first place – because the platform was coming under mounting pressure to deal with harmful content. That pressure hasn’t gone anywhere – in fact it’s arguably greater than before.

Those I’ve spoken to who have worked at Twitter tell me they’re concerned that efforts to tackle abuse and disinformation are being undone.

Everyone is waiting to see whether Musk’s approach is a careful calculation that prioritises what he calls freedom of speech, whether it is simply the product of chaos at his new company, or indeed a combination of the two.


According to the email sent to group members and seen by AP News, Twitter said the Council was no longer the best structure for it to gain input on its policy decisions.

“Our work to make Twitter a safe, informative place will be moving faster and more aggressively than ever before and we will continue to welcome your ideas going forward about how to achieve this goal,” Twitter said.

‘Unfortunate and unacceptable’

One of the UK-based charities on the Council was the Diana Award, set up in memory of Diana, Princess of Wales, which focuses on anti-bullying work and mentoring young people.

Alex Holmes, Deputy CEO of the charity, told the BBC he sits on trust and safety councils at other social media companies, including Twitch and Snapchat, and believes strongly in their purpose.

“There are really good examples of teams who have engaged experts in a really positive way to better set policies and make platforms safer,” he said.

“If you want to run a company that has healthy conversations, and that does provide safety to its users, then you consult experts, because there’s no way that a Trust and Safety team could possibly have a global understanding of every single issue and how it presents itself across the world.

“Simply when platforms are looking after numbers bigger than populations of countries, you have to consult with experts, you have to work together with individuals that have dedicated their life to protecting vulnerable, often marginalised communities.”

Mr Holmes confirmed that the Council’s role was strictly advisory: it offered guidance but never made decisions on policy.

“We were on occasions consulted by Twitter on particular policies that were being developed or thought about, or issues that were emerging, but we never had a final say.

“Ultimately it was the staff that made those final calls and decided policy and approach.”

The BBC has approached Twitter for comment.
