Twitter staff cuts leave Russian trolls unchecked


Hundreds of Russian and Chinese state propaganda accounts are thriving on Twitter after Elon Musk wiped out the team that fought these networks, the BBC has found.

The unit worked to combat “information operations”: coordinated campaigns by countries such as Russia, China and Iran, designed to influence public opinion and disrupt democracy.

But experts and former employees say the majority of these specialists resigned or were laid off, leaving the platform vulnerable to foreign manipulation. The BBC has spoken to several of them. They asked for anonymity, citing non-disclosure agreements and threats they received online.

“The whole human layer has been wiped out. All Twitter has left are automated detection systems,” a former senior employee said.

In a BBC interview on Tuesday, Musk claimed there was “less misinformation [on Twitter] rather than more” under his tenure. He did not comment on the state troll farms active on the platform, or on the team that used to fight them.

We approached Twitter for comment but received no response other than a poo emoji – the standard auto-reply from the company to any press enquiry.

‘Troll farms’

Organised groups of people posting coordinated messages are called ‘troll farms’. The term was first used by Russian reporters who exposed one such operation, employing roughly 300 paid trolls and run by Yevgeny Prigozhin, head of the Wagner mercenary group.

Since then, troll farms influencing elections and public opinion have been uncovered in many countries, from Poland and Turkey to Brazil and Mexico. They have also been used as a propaganda tool in ethnic conflicts and wars.

[Image: a cartoon promoting Russia's Wagner Group]

Now, a new group of Russian trolls is active on Twitter.

It supports Putin’s war in Ukraine, ridicules Kyiv and the West, and attacks independent Russian-language publications, including the BBC Russian Service. Many of these trolls’ accounts have been suspended, but dozens are still active.

Darren Linvill, associate professor at the Clemson University Media Forensics Hub in South Carolina, says the network appears to originate from Prigozhin’s troll factory.

Mr Linvill and his colleagues have also discovered two similar Russian-language troll networks, but from the opposite camp. One tweets in support of Ukraine, and the other promotes the Russian opposition, including the jailed Putin critic Alexey Navalny.

While they have all the markings of troll accounts, including random numbers in the Twitter handles and coordinated behaviour, these networks appear to remain undetected by the platform.
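As an illustration of that first marker, a very simple heuristic could flag handles that end in a long run of digits. The Python sketch below is purely hypothetical, not Twitter's actual detection code, and the example handles are invented:

import re

# Hypothetical heuristic: flag handles that end in five or more digits,
# a pattern common among auto-generated accounts. An invented
# illustration, not Twitter's real detection logic.
RANDOM_SUFFIX = re.compile(r"^[A-Za-z_]+\d{5,}$")

def looks_auto_generated(handle: str) -> bool:
    return bool(RANDOM_SUFFIX.match(handle))

# Invented example handles:
for handle in ["newsfan_84625170", "bbcrussian", "patriot19462837"]:
    print(handle, looks_auto_generated(handle))

A real system would combine many such signals, such as account-creation timing and coordinated posting patterns, rather than relying on handle names alone.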

The Clemson University team is also tracking pro-Chinese accounts that target users in both Chinese and English on issues of importance to the Chinese government.

With only a skeleton crew remaining, Twitter does not have the resources to swiftly detect, attribute and take down this foreign propaganda, according to former employees.

The platform had also established partnerships with research institutions that detected information operations, but scholars say they have not heard anything from Twitter since November.

Punching above its weight

Experts have long warned about the dangers of foreign influence on social media.

In 2018, the FBI said that fake accounts impersonating real Americans had played a central role in the Russian effort to meddle in the 2016 election. That was when Twitter and Facebook started hiring “information operations” specialists.


“I still remember the rage I felt when I saw accounts with names like ‘Pamela Moore’ and ‘Crystal Johnson’ purporting to be real Americans from Wisconsin and New York, but with phone numbers tracing back to St Petersburg, Russia,” recalls Yoel Roth, Twitter’s former head of Trust and Safety.

Twitter has a fraction of Facebook’s reach and budget. But over the years, it built a small but capable team. While it could not match the resources of its rival social network, Twitter “nonetheless punched above its weight”, says Lee Foster, an independent expert in information operations.

Twitter hired people with backgrounds in cybersecurity, journalism, government agencies and NGOs who spoke an array of languages including Russian, Farsi, Mandarin, Cantonese, Spanish and Portuguese.

One former investigator says: “We needed people who would be able to understand: if Russia is likely to be the responsible actor behind this, what is its motivation to do this particular operation?”

He says he resigned because his team did not fit into the ‘Twitter 2.0’ that Musk was building.

“Our role was to help make the use of Twitter as safe as possible. And it did not feel like that was likely to continue as a priority.”

Countering propaganda worldwide

The team worked in close contact with, but separately from, the teams countering misinformation. That is because state-run campaigns can use both fake news and factual stories to promote their messages.

In 2016, Russian trolls targeted black voters in the US using real footage showing police violence. And in 2022, a coordinated network promoted negative – but sometimes accurate – news about the French military contingent and United Nations missions in Africa’s Sahel region.

Both networks were taken down by Twitter.

[Image: a young black man pictured in one of the posts. Credit: Computational Propaganda Research Project]

As similar information operations were conducted on different platforms, Twitter employees met with their peers at Meta and other companies to exchange information.

But at such meetings, Twitter’s investigators would be reminded of how small their operation was. “Their team would be ten times the size of ours,” says an investigator.

Now even those resources are lacking.

Without a team dedicated to fighting coordinated campaigns, Twitter “will slowly become more and more unsafe,” says Mr Linvill of Clemson University.

With additional reporting from Hannah Gelbart.
