On August 2, 2016, the Center for European Policy Analysis released a report on the proliferation of digital misinformation and the challenges such information presents to global society and security. The report treats misinformation as a form of global warfare and describes how traffic among users on digital platforms is conducive to political manipulation. It centers on the potential threats of Russian information warfare in Eastern Europe.
Moreover, the World Economic Forum has already added the proliferation of misinformation in the public sphere by digital means to the list of the great challenges of our time: “The global risk of massive digital misinformation sits at the center of a constellation of technological and geopolitical risks ranging from terrorism to cyber attacks and the failure of global governance.”
In 2012, the Berkman Center for Internet & Society at Harvard University published a report on Russian cyberspace and its possible vulnerability to digital control. Even though the government controls traditional printed media more heavily than digital media, the report argues that the ease of taking part in digital political activism by sharing viewpoints – at far lower cost than in traditional media – paves the way for new strategies of control and manipulation aimed at the digital public. When the costs of participation are low to non-existent, it is easier for the average citizen to weigh in on political debate on the digital platforms available for such exchanges. This, in turn, leaves governmental powers with new avenues for distributing propaganda, monitoring, and possibly interfering.
Pre-digital propaganda time and again fostered false narratives and spread untruthful information through the pivotal nodes of modern society: newspapers, political speeches, and other forms of one-way communication. With social media and the internet, our digital world has become a flow of networked interaction and information for individuals and crowds alike. Everyone now has a bullhorn aimed at the digital public. Everybody is now a broker of information, but the information does not have to be true to be traded online.
On this new battlefield, propagandistic movements – nations or rebel groups – may inundate the digital arena with specific narratives or stories working to their advantage. These narratives are put there for us to see, and we may even share them with others – as an influential voice in our ultra-local digital network. From social psychology we know of information-based cascade effects, which may inexpediently result in everyone in a group believing something that is not the case. Worse, no member needs to believe it to begin with: each individual becomes convinced of such-and-so because everyone thinks that such-and-so is the general opinion in the network. Such a state – pluralistic ignorance – may be cunningly exploited on the digital battlefields of opinion. If one can get specific sections of the population – some crucial swing voters – to believe that something is the case only because they are led wrongly to think that everyone else believes it (and then adapt to the presumed crowd opinion), that may be all it takes to beat the democratic election system.
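The cascade mechanism can be made concrete with a toy threshold model in the style of Granovetter (a hypothetical sketch for illustration only; the `cascade` function and its parameters are our assumptions, not part of any cited study). Each agent publicly endorses a claim once the share of public endorsers reaches that agent's personal conformity threshold, so a small seeded minority – bots, say – can tip an entire network's public opinion without changing a single private belief.

```python
def cascade(thresholds, seeded):
    """Threshold cascade: agent i publicly endorses the claim once the
    share of public endorsers reaches thresholds[i]. 'seeded' agents
    (e.g. bots) endorse unconditionally from the start. Returns the
    number of public endorsers once the cascade has run its course."""
    n = len(thresholds)
    endorsing = set(seeded)
    changed = True
    while changed:
        changed = False
        share = len(endorsing) / n  # currently visible level of support
        for i in range(n):
            if i not in endorsing and share >= thresholds[i]:
                endorsing.add(i)    # conform to the perceived majority
                changed = True
    return len(endorsing)

# A population with evenly spread conformity thresholds tips completely
# from just five seeded voices; a uniformly sceptical population does not.
gradient = [i / 100 for i in range(100)]
print(cascade(gradient, range(5)))      # full cascade: all 100 endorse
print(cascade([0.1] * 100, range(5)))   # too few seeds: stays at 5
print(cascade([0.1] * 100, range(10)))  # one more percent of seeds tips all
```

The point of the sketch is the discontinuity: the difference between five and ten seeded voices is the difference between no effect and total public consensus, which is exactly what makes pluralistic ignorance attractive to manipulators.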
We know from studies that Twitter is being (mis)used to create profiles which automatically generate or share ’tweets’ – these profiles are also called ’bots’ (short for software robots). As Ferrara et al. write:
social media ecosystems populated by hundreds of millions of individuals present real incentives – including economic and political ones – to design algorithms that exhibit human-like behavior. Such ecosystems also raise the bar of the challenge, as they introduce new dimensions to emulate in addition to content, including the social network, temporal activity, diffusion patterns and sentiment expression. A social bot is a computer algorithm that automatically produces content and interacts with humans on social media, trying to emulate and possibly alter their behavior. Social bots have been known to inhabit social media platforms for a few years.
Thus one may inundate specific Twitter networks with bots until they constitute a significant portion of the total information supply in the networks one wants to target. Some bots are so convincing that they are extremely hard to distinguish from profiles run by real human beings. If one can mobilize a network apt enough to create a ’shitstorm’ in which real humans jump on the bandwagon, the scandal unfolds unrestrainedly. At their worst, these bots are there simply to fuel the fire.
The aforementioned Harvard report notes evidence that, in the Russian public sphere, a specific network cluster of Twitter users supporting Medvedev’s modernization program owes its popularity primarily to bots and to Twitter users specifically promoting their own presence. Moreover, there is evidence that Twitter bots were used in 2011 to quantitatively outcompete anti-Kremlin tweets in order to divest them of their potential political impact and influence on specific Twitter networks. Bots may thus be regarded as quantitative diluters of truthful, legitimate, or simply actually held political messages: they dilute specific points of view by quantitatively hijacking specific networks. This is not direct censorship of the public, but rather an attempt to drown out other stances to the point of neglect or outright disappearance.
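The dilution mechanism can be illustrated with a minimal calculation (a hypothetical sketch under our own assumptions; the feed composition and functions below are invented for illustration, not drawn from the Harvard report). Bots do not delete opposing tweets – they flood the feed so that opposing stances shrink as a fraction of what a user sees.

```python
from collections import Counter

def feed_shares(tweets):
    """Share of each stance in a feed (a user's sampled timeline)."""
    counts = Counter(tweets)
    total = len(tweets)
    return {stance: counts[stance] / total for stance in counts}

def inject_bots(tweets, stance, n_bot_tweets):
    """Quantitative dilution: nothing is removed, the feed is merely
    flooded with bot tweets promoting one stance."""
    return tweets + [stance] * n_bot_tweets

# Hypothetical organic feed: 60 anti-Kremlin tweets, 40 pro-Kremlin.
organic = ["anti"] * 60 + ["pro"] * 40
flooded = inject_bots(organic, "pro", 200)

print(feed_shares(organic))  # anti: 0.6, pro: 0.4
print(feed_shares(flooded))  # anti: 0.2, pro: 0.8
```

No anti-Kremlin tweet was censored, yet the anti-Kremlin share of the feed drops from 60% to 20% – drowned out rather than deleted, which is precisely the quantitative hijacking described above.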
Earlier this year Bloomberg Businessweek interviewed the political hacker Andrés Sepúlveda, who for a decade in Latin America, until 2015, ”led a team of hackers that stole campaign strategies, manipulated social media to create false waves of enthusiasm and derision, and installed spyware in opposition offices”. Another example of possible digital manipulation involves the Republican politician Newt Gingrich – one of Donald Trump’s loyal supporters, Speaker of the House of Representatives from 1995 to 1999, and Time Magazine’s Man of the Year in 1995 – who was accused of buying Twitter followers after 92% of his 1.3 million followers were revealed to be fake profiles.
Here is a dawning manipulation of the digital public sphere that may be initiated by nations against other geopolitical areas (for example, the public spheres of Eastern European countries could be attacked by pro-Russian flows of information). Terror groups such as ISIS can spread narratives about the moral decadence of the West and their hatred of the European intellectual tradition of free speech and thought.
The English scientist, philosopher and statesman Francis Bacon coined the slogan that knowledge is power. Put crudely, since Plato the distinction between knowledge and belief has been that knowledge must be truthful, while no such criterion necessarily attaches to belief, opinion, or whatever else one is informed about. Bacon held that reliably derived true belief – that is, knowledge – was an instrument of power, just as military strategy rests on pragmatic knowledge of artillery. Today, in the digital age, information is an instrument of power and manipulation that has to be neither truthful nor reliably derived nor aligned with the facts.
The sheer amount of information properly targeted at us may by itself knock over our aspirations to engage in reality-based political, societal, and cultural thought. When the ‘infostorm’ blows, missiles and mines seem like sticks and stones. Indeed, some influential political leaders neglect the facts as long as their narratives yield them power. Then democracies as we know them become fragile – they become post-factual – and that makes for harsh conditions for stable democratic order as we know it.
A conjecture to close: the post-factuality of modern political discourse and action lies not in the handling of facts, but in the denial of them. Very few politicians, if any, would deny the assassination of Martin Luther King, Jr., whereas today parts of the electorate may applaud the denial or neglect of well-established facts such as climate change, the scientific falsification of creationism, or the fact that Obama is a rightful US citizen. In deference to factual democracy one must insist on fact-checking as, for example, the Pulitzer Prize-winning platform politifact.com does. It shows that only 4% of Donald Trump’s statements since December 2015 have been rated completely true. It is therefore a win for factual democracy whenever an emphasis on handling facts, in whatever way, is promoted. It is a small step, but the first to be taken towards coherent democratic discourse, action and thought.
Vincent F. Hendricks is Professor of Formal Philosophy at the University of Copenhagen, Denmark, Elite Researcher of the Danish State, and Director of the Center for Information and Bubble Studies, funded by the Carlsberg Foundation. He is the author of many books, among them Infostorms (Springer Nature, 2016), Mainstream and Formal Epistemology (Cambridge University Press, 2007), and The Convergence of Scientific Knowledge (Springer, 2001). He is also the author and editor of numerous papers and books on bubble studies, formal epistemology, methodology, and logic. Hendricks was Editor-in-Chief of Synthese and Synthese Library from 2005 to 2015.
Joachim S. Wiewiura is a researcher at the Center for Information and Bubble Studies, University of Copenhagen, where he works on conceptions of the public space in relation to political philosophy, architecture, and information structures. He has taught philosophy of science and philosophy of technology, last year co-edited the anthology On the Facilitation of the Academy, and is co-director of the international conference series on the Academy.