“Fake news”, “disinformation”, “misinformation”; since the 2016 US presidential elections, these terms have exploded into the public lexicon, bringing awareness to a longstanding problem of online communication: when anyone can publish anything, how do we know anything is true?
Perhaps the cleanest way to approach solutions to the problem of disinformation is to focus on its main audiences. There are three: those who will buy the message, those who may not buy it but can be temporarily convinced or confused by it, and those who refuse to believe it. In short: the Convinced, the Confused, and the Skeptics.
Any solution would need to target the Convinced and the Confused. (The Skeptics are not buying the message, and are therefore not a cause for concern.) For both groups, much of the problem boils down to a lack of media savvy: an inability to recognize the hallmarks of disinformation, such as sensationalized headlines and a lack of sources, and a missing instinct to fact-check before reacting.
The most obvious solution is to educate unsavvy media consumers and thereby, hopefully, push them into the Skeptic camp. Some non-governmental organizations (NGOs), like IREX, have reported success in sensitizing Eastern European populations to Russian influence operations, which rely heavily on disinformation. The main challenge with such initiatives remains scaling them to reach large audiences, for which government funding or intervention may be necessary.
Another solution is to police the online environment to reduce disinformation. In most liberal democracies, however, top-down control of content smacks too much of 1984-style thought police to be feasible. Private-sector-led policing is a more palatable option, but companies’ track record with online disinformation has so far been mixed. Even the relatively straightforward task of removing extremist propaganda is something YouTube struggles to police in a timely fashion. It is possible that, regardless of improvements, the free-wheeling, adaptive environment of the web is immune to comprehensive policing. To prevent their pictures from being taken down, for example, ISIS members have taken to hosting images on third-party sites and using Twitter simply to disseminate the links.
The Convinced present an additional dimension. Unlike the Confused, they exhibit a level of buy-in to the alternative messaging of disinformation that cannot be explained away as ignorance. The missing factor is usually a deeper underlying grievance that makes the disinformation’s message attractive. Short of addressing the underlying grievance, another way to engage the Convinced is to present a counter-narrative to that of disinformation campaigns.
Such counter-messaging is difficult to get right. A counter-narrative already exists – the mainstream ideas the Convinced have already rejected in favor of what disinformation offers. Messages therefore need to be tailored and delivered by a credible source to have any hope of changing a Convinced mind. Government efforts at countering violent extremism show that counter-narrative building runs into three major problems: credibility, relevance, and volume. The State Department’s Think Again Turn Away campaign to counter ISIS’ online presence perfectly encapsulates the first two. In addition to coming from a non-credible source, the US government, the messaging relied too heavily on highlighting ISIS’ violence as a deterrent – violence was probably not a major issue for many ISIS sympathizers, since ISIS was already publicizing it itself. Volume is the third challenge. Fact-checking is naturally a time-consuming venture, and government-led initiatives are slower than larger, more dedicated organizations like Russian troll farms. Disinformation campaigns will therefore usually outpace government efforts to counter-message.
The common wisdom is that communities must do their own counter-messaging to ensure better relevance and credibility. However, an offshoot of the volume problem remains: competition for attention is tough. One approach that has shown some success is marrying online and offline efforts through guerrilla marketing, as in the Trojan T-Shirt campaign of the NGO EXIT-Germany. The NGO, which helps individuals break from right-wing extremism, went undercover and distributed t-shirts at right-wing events that appeared to espouse right-wing messages. When washed, the “trojan” t-shirts revealed a hidden message directing the wearer to reach out to EXIT-Germany. As a result, the number of right-wing extremists contacting EXIT-Germany tripled. Targeted ad campaigns may also help: one experiment, The Redirect Method, found that when ISIS counter-messaging advertisements were tailored to users’ behavior, the click-through rate doubled.
The bottom line is that there is no magic bullet for combating online disinformation, and many of the strategies that can be employed have their own limitations. The silver lining is that people have become quite good at changing one another’s opinions over time. There is no reason those skills cannot be redeployed in service of a better cause.
Lucas, E., & Pomerantsev, P. (August 2016). Winning the Information War: Techniques and Counter-strategies to Russian Propaganda in Central and Eastern Europe. CEPA’s Information Warfare Project in Partnership with the Legatum Institute.
Ibid.
Stalinsky, Steven, and R. Sosnow. “The Jihadi Cycle On Content-Sharing Web Services 2009-2016 And The Case Of Justpaste.it: Favored By ISIS, Al-Qaeda, And Other Jihadis For Posting Content And Sharing It On Twitter – Jihadis Move To Their Own Platforms (Manbar, Nashir, Alors.Ninja) But Then Return To Justpaste.it.” MEMRI. June 06, 2016. Accessed February 08, 2018. https://www.memri.org/reports/jihadi-cycle-content-sharing-web-services-2009-2016-and-case-justpasteit-favored-isis-al.
Meleagrou-Hitchens, Alexander. “The Challenges and Limitations of Online Counter-Narratives in the Fight against ISIS Recruitment in Europe and North America.” Georgetown Journal of International Affairs 18, no. 3 (Fall 2017).
Paul, Christopher and Miriam Matthews. The Russian “Firehose of Falsehood” Propaganda Model: Why It Might Work and Options to Counter It. Santa Monica, CA: RAND Corporation, 2016. https://www.rand.org/pubs/perspectives/PE198.html. Accessed 10 November 2017.
“What CVE Can Learn from Guerrilla Marketing.” Lawfare. October 17, 2017. Accessed February 07, 2018. https://www.lawfareblog.com/what-cve-can-learn-guerrilla-marketing.
Dafnos, Andreas. “Narratives as a Means of Countering the Radical Right: Looking into the Trojan T-shirt Project.” Journal EXIT-Deutschland 3 (2014).
“The Redirect Method.” The Pilot. Accessed February 08, 2018. https://redirectmethod.org/pilot/.
Georgetown Security Studies Review (georgetownsecuritystudiesreview.org), by GSSR, February 14, 2018.