
7 August 2018

Cyber-enabled information and influence operations—it’s not just Russia

Danielle Cave

Each month we learn more and more about the extent of Russia’s interference in the 2016 US elections. Fraudulent social media accounts propping up non-existent political commentators, armies of Twitter bots designed to cluster around and drive defined political and social issues, carefully crafted ‘dark posts’ that only some users could see, and political rallies coordinated through social media event pages are all now standard media fodder. This sophisticated covert campaign used disinformation to sow confusion and magnify noise and disagreement. It prodded at, and promoted, a lack of confidence in American leaders and institutions. It did so by taking advantage of the openness of American society, and by leveraging cyberspace in new and creative ways that outpaced and outfoxed government thinking. Given the lack of response from the US government during (and immediately after) the elections, and the media’s seeming lack of awareness that events were being manipulated, it’s fair to say few understood the magnitude of what was coming down the pipeline.


At an estimated cost of US$1.25 million a month—chump change for most developed countries’ intelligence services—the operation was a steal for the Russian government. (If you haven’t already done so, do read Special Counsel Robert Mueller’s 37-page indictment of 13 Russian nationals and three Russian entities, including the Internet Research Agency.)

While international media remains focused on Russian influence operations in the US and Europe—Sweden is the latest to prepare for possible election meddling—it’s important to note that covert cyber-enabled influence operations take place around the world, including in the Indo-Pacific.

In the Philippines, for example, media and academia have tracked how President Rodrigo Duterte’s ‘keyboard trolls’ spread and amplify messages in support of his policies through a combination of social media bots and fake accounts. Parts of this domestically focused operation appear to be coming straight from the president’s office. A 2017 Oxford paper claimed that Duterte’s office had a budget of US$200,000 and employed 400–500 people to promote the president and defend him from online criticism.

One operation that Australia’s national security community should watch closely is being investigated in Taiwan. Taipei’s District Prosecutors’ Office alleges that the Chinese government has been running a multi-year operation ‘aimed at infiltrating the military through obtaining confidential information from digital networks and databases, deepening existing contacts, holding military-related events and filing academic research reports’.

Apparently conducted through the Chinese government’s Taiwan Affairs Office (TAO), the operation involved financing pro-unification propaganda website FIRE News and then using the website’s Facebook page to recruit contacts, ideally senior military contacts. It’s been reported that TAO paid FIRE News’ administrators A$130 for every new Facebook like (as long as that user liked and read at least 70% of the page’s content). It offered A$220 for each Facebook user who interacted with the creators of the page at least once every two weeks for a minimum of two months. If offline meetings were secured with contacts (this had to be proved with photographs), A$435 was up for grabs.

A reward of A$2,180 was given if during these two-person exchanges the Facebook user opened up about their politics and personal feelings. If someone made it to this stage of the operation, it’s alleged that they were told to immediately get in touch with TAO for further instructions.

We could view this operation as stock-standard human intelligence collection—recruiting agents to recruit agents. But it’s actually a two-for-one hybrid operation. We have an Avon/Amway-style espionage operation fused with a cyber-enabled influence operation (taking place through both the website and its attached Facebook page). Compared with Russia’s activities in this space, this operation was quieter (until it was discovered, of course), and its approach appeared tilted towards long-term gain rather than short-term outcomes.

While this is one of the more interesting influence operations we are aware of in Asia, it’s only one case study we can learn from. Scratch the surface of content farms, particularly how they’re deployed against Taiwan, and we can glean insights into the types of information-warfare tactics being used in our region.

For Australia, it’s essential that we keep an eye on such influence operations occurring closer to home, particularly as we move towards our next national election. That’s a topic I’ll tackle in my next post.
