CORDIS - EU research results

Computational Propaganda: Investigating the Impact of Algorithms and Bots on Political Discourse in Europe

The secret robot armies fighting to undermine democracy

Governments and special interest groups are using networks of automated accounts on social media to sow dissent, spread disinformation and subvert their opponents.

Society | Security

Funded through a European Research Council (ERC) grant, the COMPROP (Computational Propaganda: Investigating the Impact of Algorithms and Bots on Political Discourse in Europe) project set out to investigate networks of automated social media accounts and their role in shaping public opinion. Researchers led by principal investigator Philip Howard produced a codified definition of ‘junk news’: deliberately produced misleading, deceptive and incorrect propaganda purporting to be real news. The team then examined millions of posts on social media to see how these messages were produced and disseminated.

Though initially focused on Twitter, the team at the University of Oxford’s Programme on Democracy and Technology found computational propaganda – algorithms put to work for a political agenda – on Facebook, Instagram, Telegram, YouTube, and even the dating app Tinder. “We didn’t expect that over the course of the project the problem would grow as bad as it did,” notes Howard. “We can see how some governments, lobbyists, the far right and white supremacists all use these to manipulate democracies.”

The project also focused heavily on COVID-19 misinformation, which Howard notes came chiefly from three sources: Russian media, Chinese media, and US President Donald Trump. While Trump’s disinformation was tied to domestic American politics, Russia and China pushed three broad themes intended for foreign audiences. “The first was that democracy can’t help us, elected leaders are too weak to make decisions,” says Howard. “The second message was that Russian or Chinese scientists were going to get the vaccine first, and the third was that Russia or China was leading on humanitarian assistance efforts.”
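The project’s published methodology graded sources against criteria such as professionalism, style, credibility, bias and counterfeit branding, labelling a source as junk when it failed several of them. The sketch below illustrates that kind of rule-based labelling; the field names and the three-of-five threshold are illustrative assumptions rather than the project’s exact coding scheme, and in practice such judgments were made by trained human coders.

```python
from dataclasses import dataclass

# Criteria loosely modelled on the COMPROP junk-news coding scheme;
# the field names and the >=3 threshold are illustrative assumptions.
@dataclass
class SourceAudit:
    name: str
    fails_professionalism: bool  # no bylines, corrections or sourcing
    fails_style: bool            # emotive, hyperbolic, ad hominem copy
    fails_credibility: bool      # relies on rumour or conspiracy
    fails_bias: bool             # hyper-partisan opinion dressed as news
    is_counterfeit: bool         # mimics the branding of real outlets

def is_junk_news(audit: SourceAudit, threshold: int = 3) -> bool:
    """Label a source as junk news when it fails `threshold` or more criteria."""
    failures = sum([
        audit.fails_professionalism,
        audit.fails_style,
        audit.fails_credibility,
        audit.fails_bias,
        audit.is_counterfeit,
    ])
    return failures >= threshold

# Hypothetical example: a counterfeit, hyper-partisan site with no editorial standards.
site = SourceAudit("example-news.biz", True, True, False, True, True)
print(is_junk_news(site))  # True
```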

Under the influence

These misinformation campaigns predate the COVID-19 pandemic, however. “When Malaysia Airlines Flight 17 was shot down over Ukraine, there were multiple ridiculous stories of what transpired – that democracy advocates shot it down, that American troops shot it down, that a lost tank from WWII came out of the forest and shot it down,” adds Howard. By laying out multiple conflicting stories, authoritarian regimes leave their citizens unable to tell which narrative to believe. This strategy was eventually turned outward, to undermine social movements and destabilise foreign nations.

“Sometimes campaigns are about a specific crisis or person, but often the goal is to undermine trust in courts, police, journalism, science, or government at large,” explains Howard. He adds that the target audience for these bots is perhaps only 10-20 % of the population, typically disaffected, conservative-leaning adults who are politically active. In a highly polarised country, swaying 10 % of the electorate can have a resounding impact. Such campaigns are particularly damaging to the role of women and minorities in public life, Howard notes: “Feminists, female journalists, and female politicians get a nasty form of attack and disinformation on social media. It’s much easier to drive a woman out of public life than a man.”

Government intervention

Howard says more effort is needed to contain these propaganda networks. “We’re past the point of self-regulation by industry. If tech firms stepped up, and governments imposed fines on politicians who commission these programmes, that set of initiatives would go a long way.” Yet even identifying which social media accounts are automated has proven difficult. “One bot writer in Germany said his team would read our methodology papers and adjust their algorithms to just below our catchment,” remarks Howard. “We were in a sort of dialogue with these programmers.”

The group was also awarded a proof-of-concept grant to develop the Junk News Aggregator, a tool which interactively displays articles from unreliable sources as they spread on Facebook. Howard and his team are now focused on how machine learning technology will power a new generation of computational propaganda. “If someone can take your social media feed and behavioural data, and come up with political messages you’ll respond to, they’ll do that,” he concludes. “This is the next great threat.”
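The ‘catchment’ Howard mentions refers to the activity thresholds used to separate likely automated accounts from human ones; COMPROP working papers, for instance, treated accounts posting 50 or more times a day on political hashtags as highly automated. A minimal sketch of that style of frequency heuristic follows, assuming a simple list of (account, timestamp) records and treating the 50-posts-per-day threshold as a tunable parameter:

```python
from collections import Counter
from datetime import datetime, timedelta

def flag_high_frequency_accounts(posts, threshold=50):
    """Flag accounts that exceed `threshold` posts on any single calendar day."""
    # Count posts per (account, day) pair, then keep accounts over the threshold.
    per_day = Counter((account, ts.date()) for account, ts in posts)
    return {account for (account, _day), n in per_day.items() if n >= threshold}

# Hypothetical sample: one account posting every ten minutes, one posting once.
base = datetime(2020, 3, 1)
posts = [("@likely_bot", base + timedelta(minutes=10 * i)) for i in range(60)]
posts.append(("@human", base + timedelta(hours=9)))
print(flag_high_frequency_accounts(posts))  # {'@likely_bot'}
```

As the German bot writer’s remark suggests, any fixed threshold invites evasion – an account tuned to post 49 times a day passes unflagged – which is why detection methods have to keep evolving alongside the bots they track.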

Keywords

COMPROP, computational, propaganda, bot, social media, political, COVID, algorithm, machine learning, Russia, China, Trump
