This page was exported from David Olliver
Export date: Tue Oct 24 7:28:42 2017 / +0000 GMT
The Computational Propaganda Research Project (COMPROP) investigates the interaction of algorithms, automation and politics. This work includes analysis of how tools like social media bots are used to manipulate public opinion by amplifying or repressing political content, disinformation, hate speech, junk or fake news.
In their most recent report, COMPROP have identified how organisations, often funded with public money, have created a system to help “define and manage what is in the best interest of the public”.
COMPROP have compared such organisations across 28 countries, created an inventory system and logged the kinds of messages, valences (positive or negative messaging) and communication strategies used. They have also catalogued organisational forms and evaluated their capacities in terms of budgets and staffing.
This article focuses on the use of cyber troops.
The basic findings include the following:
The report mentions that “In January 2015, the British Army announced that its 77th Brigade would ‘focus on non-lethal psychological operations using social networks like Facebook and Twitter to fight enemies by gaining control of the narrative in the information age'. The primary task of this unit is to shape public behaviour through the use of ‘dynamic narratives' to combat the political propaganda disseminated by terrorist organisations. The United Kingdom is not alone in allocating troops and funding for influencing online political discourse. Instead, this is part of a larger phenomenon whereby governments are turning to Internet platforms to exert influence over information flows and communication channels to shape public opinion.”
What is of concern in the report is that cyber troops use a variety of strategies, tools and techniques for social media manipulation. Generally speaking, teams have an overarching communications strategy that involves creating official government applications, websites or platforms for disseminating content; using accounts — whether real, fake or automated — to interact with users on social media; or creating substantive content such as images, videos or blog posts. Some teams send pro-government, positive or nationalistic messages when engaging with the public online. Other teams will harass, troll or threaten users who express dissenting positions.
Other, more popular forms of individual targeting involve various forms of harassment: verbal abuse, hate speech, discrimination and/or trolling directed against the values, beliefs or identity of a user or a group of users online. Some governments deploy this type of harassment during important political events, notably elections.
In addition to official government accounts, many cyber troop teams run fake accounts to mask their identity and interests. This phenomenon has sometimes been referred to as “astroturfing”, whereby the identity of a sponsor or organisation is made to appear as grassroots activism (Howard, 2003). In many cases, these fake accounts are “bots”—or bits of code designed to interact with and mimic human users. According to media reports, bots have been deployed by government actors in Argentina (Rueda, 2012), Azerbaijan (Geybulla, 2016), Iran (BBC News, 2016), Mexico (O'Carrol, 2017), the Philippines (Williams S, 2017), Russia (Duncan, 2016), South Korea (Sang‐Hun, 2013), Syria (York, 2011), Turkey (Shearlaw, 2016) and Venezuela (VOA News, 2015).
Some cyber troop teams create content to spread certain political messages. This content creation amounts to more than just a comment on a blog or social media feed, but instead includes the creation of content such as blog posts, YouTube videos, fake news stories, pictures or memes that help promote the government's political agenda. In the United Kingdom, cyber troops have been known to create and upload YouTube videos that “contain persuasive messages” under online aliases (Benedictus, 2016).
Government‐based cyber troops are public servants tasked with influencing public opinion. These individuals are directly employed by the state as civil servants, and often form a small part of a larger government administration. The report finds that “cyber troops can be found across a variety of government ministries and functions.” GCHQ is one such department.
The Australian Coalition used social media during its 2013 campaign to manipulate the public, deploying fake accounts to artificially inflate the number of followers, likes, shares or retweets a candidate received and so creating a false sense of popularity.
In Israel, the government actively works with student volunteers from Jewish organisations and other pro-Israel groups around the world; in many cases, top-performing volunteers are awarded scholarships for their work (Stern-Hoffman, 2013).
Samantha Bradshaw, the report's lead author, summed up its conclusions for Bloomberg, noting the prominence of social media manipulation among democratic governments:
“I don't think people realise how much governments are using these tools to reach them. It's a lot more hidden,” she said.
“They are using the same tools and techniques as the authoritarian regimes,” Bradshaw said. “Maybe the motivations are different, but it's hard to tell without the transparency.”
In the meantime, it should not be forgotten that while governments around the world, including Britain's, are actively engaging in online public manipulation, Theresa May, the prime minister, has already asked governments to unite to regulate what tech companies like Google, Facebook and Twitter allow to be posted on their networks. The EU has already clamped down, prompting complaints that it is effectively shutting down free speech rather than curtailing hate speech, whilst engaging in exactly that: hate speech.
Whilst you might expect some governments, such as those of Azerbaijan, China, Israel and North Korea, to be engaging cyber troops to manipulate public opinion, you would not expect western democracies such as the USA, the UK or Germany to be doing so. But then again, these very same countries have built massive 360-degree mass surveillance systems without any public debate at all.
Post date: 2017-07-22 06:27:35
Post modified date: 2017-07-22 06:31:31