Flame Wars or Bot Wars?

Anyone who’s spent much time in political debate on social media platforms can’t help but notice the deeply entrenched nature of the discourse. But how much of the bitterness is even instigated by human beings?

It was recently reported that nearly 700,000 Twitter users had interacted with Russian bot accounts, whose purpose may well be deliberate antagonism and division.

This 2014 Adam Curtis piece for Charlie Brooker’s Newswipe outlines the work of the Russian political strategist Vladislav Surkov, whose approach (influenced by a background in avant-garde art) is built on deliberate confusion, keeping populations in a constant state of not knowing whether they are coming or going. Ring any bells?

In this way, bots can be programmed to pump out inflammatory messages to wind people up on either side of a debate: for example, Black Lives Matter vs Blue Lives Matter (the latter in defence of the police so often caught up in the shootings of black US citizens). It’s not just classic divide and rule: the confusion also makes it easier for governments to carry out their programmes behind a fog of uncertainty.

Obviously, this may not be a purely Russian phenomenon, and it might be said that some people are so predictable and cliché-ridden in their online activity that they may as well be bots anyway.

Twitter and Facebook have come under heavy fire for “allowing” this to happen, but quite how even they could effectively police billions of accounts remains a mystery.

Perhaps a more effective and less authoritarian long-term solution is to ensure that news consumers are better equipped to work out when they are being played, as recently proposed to MPs by Professor Stephan Lewandowsky of the University of Bristol.

Debate on social media is increasingly influential, but the medium is still in its relative infancy. It’s as well to acknowledge that we may not always navigate the pitfalls effectively, especially as those pits are often dug by full-time professional tech and propaganda experts.

Stephen Durrant

The Media Fund