BY PATRICK TUCKER
Lawmakers and leaders want to fight foreign influence operations, but they aren’t asking the right questions.
U.S. lawmakers are finally getting serious about disinformation from nation states as a national security concern. The 2017 National Defense Authorization Act directs the State Department’s Global Engagement Center, working with the Defense Department and other agencies, to “counter foreign propaganda and disinformation directed against United States national security interests and proactively advance fact-based narratives that support United States allies and interests.”
The provision borrows heavily from the Countering Foreign Propaganda and Disinformation Act, whose author, Sen. Rob Portman, R-Ohio, has grown increasingly concerned about foreign influence operations. A staffer for Portman cited Russia’s 2014 annexation of Crimea.
“Russia launched a coordinated information campaign to undermine and delegitimize the post-Maidan Ukrainian government and subvert its authority over the peninsula, while obscuring the role of the Russian government in these efforts to the rest of the world,” the staffer said. “By the time Western nations finally agreed on what was happening, Russia had solidified its hold over the territory such that it was too late to do much about it.”
China’s construction of island military bases in the South China Sea is another good example, the staffer said.
The new center will have $160 million for its efforts over the next two years — but even that kind of money can’t ease the difficult and controversial choices ahead.
Knowing Your Audience
Rand Waltzman, a former program manager at the Defense Advanced Research Projects Agency, or DARPA, was fighting foreign propaganda before it was cool. At DARPA, he led an unclassified program called Social Media in Strategic Communication, or SMISC, which spent about $50 million to study the spread of ideas and influence: everything from how online videos affect emotions to how Twitter bots accelerate the spread of fake news. When the program ended in 2015, Waltzman began speaking out, before the House Armed Services Committee and elsewhere. His message: the United States was constraining itself unnecessarily, and perhaps fatally, in its response to Russia, ISIS, and other enemies.
“An important question regarding the survival of our nation is how we answer the increasing threats that we face in the Information Environment from adversaries who range from nation states large and small to criminal or terrorist organizations to a handful of people with malicious intent,” he wrote in a newsletter for the conservative American Foreign Policy Council. “At this time, all of our adversaries possess a significant asymmetric advantage over us as a result of policy, legal, and organizational constraints that we are subject to and they are not. We need honest and open debate about how to meet these threats.”
To combat this “cognitive hacking,” he proposed a Center for Information Environment Security to serve as a “network of networks of human efforts, technology development and technology transfer” related to fighting foreign influence operations.
Waltzman believes the key is applying intelligence and reconnaissance practices online against the same adversaries we were looking at offline. The first recommendation: “Understand the adversary’s targeted audiences and information dissemination networks including what channels they use and what messages seem to resonate with their targeted audiences. Do this by, for example, monitoring Twitter and other public commentary. Identify adversary operatives and fanboys (amplifiers) in real time and measure shifting keywords, emerging content, and instructions to meet on other platforms.”
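None of that requires exotic tooling. As a rough, hypothetical sketch of what “measure shifting keywords” could look like in practice, the core is simply comparing term frequencies across successive time windows of public posts. The code below is illustrative only, assumes the posts have already been collected, and implies nothing about how SMISC or the Global Engagement Center actually operate.

    from collections import Counter

    def keyword_shift(old_posts, new_posts, min_count=5):
        """Compare term frequencies between two time windows of public posts
        and return the terms whose share of the conversation grew the most."""
        def frequencies(posts):
            counts = Counter(word.lower() for post in posts for word in post.split())
            total = sum(counts.values()) or 1  # avoid division by zero on an empty window
            return counts, total

        old_counts, old_total = frequencies(old_posts)
        new_counts, new_total = frequencies(new_posts)

        shifts = {}
        for term, count in new_counts.items():
            if count < min_count:
                continue  # too rare in the new window to call a trend
            shifts[term] = count / new_total - old_counts.get(term, 0) / old_total

        # Largest positive shifts first: the "emerging content" of the newest window.
        return sorted(shifts.items(), key=lambda kv: kv[1], reverse=True)

Identifying the “fanboys” Waltzman mentions would be a similar counting exercise, this time over who reposts whom and how quickly.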
Indeed, that’s very much how the State and Defense departments are targeting messaging from ISIS, if not yet from actual nation states. State’s Global Engagement Center reaches out to partners: nation states, NGOs, even individuals. The center also uses social network data to identify groups and people who might be particularly susceptible to such messaging.
Those partners, or the Center itself, then send those audiences targeted messages, says Michael Lumpkin, who leads the Center.
“Using Facebook ads, I can go within Facebook, I can go grab an audience, I can pick Country X, I need age group 13 to 34, I need people who have liked — whether it’s Abu Bakr Al Baghdadi or any other set — I can shoot and hit them directly with messaging for … in some places, pennies a click.”
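The mechanics Lumpkin describes amount to a standard paid-advertising targeting spec. As a purely hypothetical sketch, with field names invented for illustration rather than drawn from Facebook’s actual advertising tools, the entire audience definition fits in a handful of fields:

    # Hypothetical ad-targeting request of the kind Lumpkin describes.
    # Field names and values are illustrative, not a real advertising API.
    targeting_request = {
        "country": "XX",                        # "Country X" in Lumpkin's example
        "age_min": 13,
        "age_max": 34,
        "interests": ["Abu Bakr al-Baghdadi"],  # pages the target audience has "liked"
        "creative": "counter-message-video-01", # the ad or video to serve
        "max_cost_per_click_usd": 0.02,         # "pennies a click"
    }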
But fighting foreign influence in the United States is a lot trickier. Federal law, 50 U.S. Code § 3093(f), effectively prohibits covert action “intended to influence United States political processes, public opinion, policies, or media,” while the 1974 Privacy Act prohibits the unfettered dissemination of personally identifiable information.
Lumpkin says the law leaves too many questions unanswered.
“The 1974 act was created to make sure that we aren’t collecting data on U.S. citizens. Well, … by definition the World Wide Web is worldwide. There is no passport that goes with it. If it’s a Tunisian citizen in the United States or a U.S. citizen in Tunisia, I don’t have the ability to discern that. Therefore, I have trouble grabbing that personally identifiable information,” he said. “If I had more ability to work with that [personally identifiable information] and had access…I could do more targeting, more definitively, to make sure I could hit the right message to the right audience at the right time.”
Waltzman says that the government takes both laws too literally.
“The interpretations are extremely conservative, to the point where we aren’t able to function. It’s a combination of interpretation, existing laws and policies which are based on those, and attitude,” he said.
But a key question remains: can “fact-based narratives” counter foreign propaganda and influence operations? Consider the emails stolen from the Democratic National Committee and Clinton aide John Podesta. U.S. intelligence agencies have concluded that the theft of the emails and their later publication were part of a concerted Russian influence operation. But it was not a misinformation campaign, per se, because the emails in question were real, and no evidence has yet emerged to suggest that they were tampered with.
The spread of false or misleading information via foreign, state-owned news outlets such as RT is closer to the sort of tactic the government can fight. But when U.S. outlets repeat such reports without verifying their accuracy, it’s bad journalistic practice, not war.
The United States has a different menu of appropriate responses depending on the tactics of the information operation against it. Each potential response carries high risk. For a foreign state running an influence operation, fake news, stolen emails, Twitter bots, trolls, and the like are complementary tools that can be deployed at low cost and low risk.
There’s a reason that false news, stolen emails, and scandal grab attention and influence behavior. It’s the same reason that jihadist videos work on disaffected people: a strong emotional appeal. Marketers have known the secrets of emotional manipulation for years; now, the U.S. government and military are trying to catch up.
Gut Appeal
In a Nov. 9 meeting at the Pentagon, a variety of social science, cognitive science, and psychological warfare researchers assembled to discuss the latest trends in influence operations. A key theme emerged: when pitching a story to an audience, emotional appeals usually beat rational dialogue.
In a recording of the discussion, organizer Jason Spitaletta from the Johns Hopkins Applied Physics Lab told participants that much current practice in psychological operations is not validated by data.
“It’s this counterintuitive finding that the more we’re engaging in the message, the more we’re actually making it more effective for our adversary,” he said. “Countering it is … research would suggest, a very bad idea, yet it forms the basis of our current counter messaging policy.”
Spencer Gerrol, of the neuro-marketing consultancy SPARKNeuro, agreed. “The primary thing that influences behaviors is, as Jason said, emotion. We have shown that with our metrics, when we see a strong emotional response, that strong emotional response correlates to changes in real world behaviors,” he said.
This is why fewer people read fact-checking stories about fake news than read the fake news itself. The latter is tailor-made for groups with a connection to polarizing subjects such as immigration, Islam, Hillary Clinton, or terrorism. But people who are emotionally connected to fact-check stories are mostly journalists.
Here’s the main challenge for the government: developing an emotionally compelling story requires an understanding of the audience. What motivates them? What do they respond to? What issues are they emotionally connected to? This is information that we give away freely to marketers in our daily digital life, in our likes and our retweets, in the sites that we visit, information that they use to make persuasive pitches to us. But we are less happy to give it to the government even when our own physical security is at stake, much less the customer engagement score of some government-sponsored “fact-based narrative.”
If the United States really wants to fight foreign influence, what’s needed is a real discussion about how emotional triggering works, what foreign influence operations are, and how to fight them. Waltzman says the Pentagon is not in a good position to begin that discussion even internally, much less with the public.
“U.S. military doctrine is primarily kinetically oriented,” he said. “Our opponents are becoming less kinetically oriented by the day. If you look at our opponents, their concept of what makes up warfare is actually quite different. It’s much more all encompassing. It’s social; it’s economic. It’s not ‘get out the guns and start shooting, this is their weapon of last resort’…Guess what, it’s practice makes perfect. The more you practice, the more you develop [tactics, techniques and procedures], the better you are. They’re out there practicing in an unrestricted way and we’re not.”