Manchurian Candidates | Sunday Observer


25 March, 2018

Novelist Richard Condon’s political thriller ‘The Manchurian Candidate’, published in 1959, deals with two central characters, both of whom are brainwashed through what is now often called psychological operations – psy-ops for short. One character is programmed to kill on a trigger, which in the novel is something as innocuous as the Queen of Diamonds playing card. Whenever he sees or is shown the card, he obeys orders – deeply uncharacteristic and extremely violent – that he does not consciously recollect. Condon’s novel resonates more in 2018, and among a broader population, than at any time since its publication.

Backgrounder

An undercover investigation by Channel 4 News revealed last week how the data analytics firm Cambridge Analytica secretly campaigns in elections across the world. Bosses of Cambridge Analytica, the firm credited with helping Donald Trump to presidential victory, were filmed talking about using bribes, ex-spies, fake IDs and sex workers. The Channel 4 reporter posed as a Sri Lankan businessman wanting to influence a local election. The explosive documentary comes in the wake of exposés in the New York Times and the Observer, featuring revelations by a former Cambridge Analytica employee who described how the firm harvested user data from social media sites like Facebook to target American voters in 2016.

We are all Manchurian Candidates. Reacting emotively to things we see online, many of us immediately put into words or action what we feel, instead of thinking through what a more reasoned response should and can be. Sometimes, especially when fed misinformation over time, this leads to violence by those who never knew they would be drawn to it. Knowing and gleaning information on socio-political triggers can vastly help destabilise any political context – no matter how stable it appears to be – to an extent where the promise of security, stability and sanity is enough currency to elect even those previously deemed unsuitable for public office.

Conversely, inconvenient histories and truths no longer require the murder of journalists or the burning of printing presses to suppress or erase. Vast sections of polity and society can today, over a relatively short period of time, be manipulated and mobilised to drown out, decry, deny or violently destroy narratives too explosive to be written into history. Perversely, those seeking or speaking the truth are the most vilified, while those who deny facts are perceived or projected as bearers of truth. Weaponising a combination of high adult literacy and low media literacy, social media in particular is leveraged to spread rumour and stoke anxiety, in ways that even many discerning citizens cannot easily identify as propaganda or sophisticated psy-ops.

This is no longer the sole domain of fiction or Hollywood. It’s real. It’s happening. And it will grow.

The extent of the problem is worth capturing, even in passing. Every year, Adobe, maker of the photo-manipulation program Photoshop, stages a massive conference aimed at leading designers, programmers, architects, journalists, artists and others from across the world.

Over the past two years, the technologies Adobe has demonstrated there – which will, in a few years or less, be part of Photoshop and its other programs – are absolutely fascinating and positively frightening in equal measure.

Videos that manipulate the mouth and face of the person on screen, in real time, to say whatever you want them to say. Audio that uses the recorded speaker’s own voice to say anything you want said. Images that can turn a sunny day into a winter storm.

In sum, media digitally doctored so well it is indistinguishable from fact, even to trained eyes.

All of these have huge commercial and creative applications, of course – which is why they are being developed. But the implications of their use – inevitable and almost immediate – in political communications are very dangerous.

Add to this suite of technologies the increasingly sophisticated attacks on electoral infrastructure: siphoning vital information, manipulating records, doctoring the results of polls and elections, undermining public trust and confidence. The garnish on this nightmarish scenario is fake news, a term used and abused so much that it has lost its ability to describe the phenomenon it set out to capture – digital propaganda. The generation of narratives as smear campaigns against political opponents isn’t new. What’s new is the way digital content is being targeted at voters – right down to the neighbourhoods they live in, what they buy and from where, which God they pray to and what news media they consume. This laser focus is complemented by the manipulation of fear. Framed and fuelled by sophisticated media campaigns that often produce seemingly amateurish, emotive output geared for mass appeal, these fears metastasize over time to deeply influence thinking, behaviour and responses. An election today is won or lost well before the exercise of franchise at the ballot box.

We are talking about the hacking of minds. And this isn’t science fiction.

The revelations last week by the UK broadcaster Channel 4 into the inner workings of Cambridge Analytica expose a world that many in Sri Lanka do not even know exists. In January, data scientist Yudhanjaya Wijeratne and I revealed the degree to which Namal Rajapaksa had weaponised his Twitter account, to the point where those who questioned his chutzpah, hypocrisy or humbug were viciously attacked over social media.

It is a different kind of censorship or silencing at play here. Think of a pirith chant or a choir in church. Now think of either at a volume so great that everything else is drowned out. Imagine this happening over extended periods of time. In turning sublime harmony into sustained cacophony, vital narratives are erased before they are even recorded. This is what the weaponisation of social media does to critical public discourse.

When Wijeratne and I warned that all this was happening in Sri Lanka, there were those who scoffed at the idea. And yet the Channel 4 investigation was a sting operation anchored to a “rich Sri Lankan family” – an entirely fictitious construct which, tellingly, was enough to galvanise the sustained interest of a company not in the business of meeting clients from markets it cannot exploit, has no hard data on, or cannot make a good sell in. That Cambridge Analytica was so interested in Sri Lanka, and spoke to a prospective Sri Lankan client about what it has done elsewhere – including in the Trump campaign – speaks volumes about how valuable our electorate, electoral systems, polity and society are to such firms, and by extension how vulnerable they are to psy-ops, at scale.

This then is our new political reality, globally and locally. We are living in a time where a tectonic shift has already occurred. The mediators of the public will – technology companies from Silicon Valley, almost entirely unaccountable to governments – are now the platforms of democratic dialogue as well as demagogic destruction.

The platforms and the companies that own them say, in their own defence, that they are mere vehicles of public opinion and do little to nothing to amplify individual narratives.

This is a risible lie at the algorithmic, managerial, political and platform design levels. But the most visible harm these platforms engender comes in the form of companies like Cambridge Analytica, which harvest social media not through data breaches or hacking, but through careful logging, targeting, observation and analysis.

They monetise and weaponise, by brokering vast amounts of private data, the very likes, shares, retweets, emoticons and comments we send each other, billions of times a day, every day. They do this invisibly – like ghosts, a word senior management of Cambridge Analytica actually use in the Channel 4 documentary to describe how they do what they do.

How then do we protect ourselves, and restore public faith in truly democratic dialogue and the legitimacy of electoral processes? There is no quick fix or panacea, sadly. A public conversation – urgent, honest and sustained – needs to happen between governments, technology giants and civil society, on ways to mitigate the worst abuses of technology. This needs to be both global and local. Media and information literacy needs to be part of school curricula. Our children need to be taught how to engage with media in ways we, as adults, were never warned about, through technologies we never had growing up and many still do not understand.

Fear can motivate the search for responses, but must never overtake the democratic impulse or inform policies that censoriously regulate. We – you and I – are at the heart of the problem. Every like, heart, share, retweet, email, star or comment – every story we tell others that we first saw on social media – often promotes, consciously or unwittingly, sinuous lies or rumours that fuel fear and violence.

To reflect first rather than react in haste, and to question in order to quell, are the keys to unravelling our world of misinformation. Cambridge Analytica, Facebook and others like them treat us as pawns in a game of their own making. An informed citizenry and consumer can and must change this. There is no task more important, to my mind, than this.

Sanjana Hattotuwa is a doctoral student at the University of Otago, and the Founding Editor of www.groundviews.org
