16 December 2021

On the Report of the Aspen Commission on Information Disorder

Herb Lin

On Nov. 15, the Aspen Institute released a report underscoring the dangers of information disorder and making a number of recommendations to global leaders to address that issue. The report was authored by the three co-chairs of the Aspen Institute’s Commission on Information Disorder (Katie Couric, Chris Krebs and Rashad Robinson). Commissioners, of whom I was privileged to be one, contributed to and broadly aligned with the report, though they were not required to fully endorse every recommendation and insight contained in the final version. This post, which draws extensively on the text of the report and on my supplementary statement, offers some further personal reflections.

The report should be required reading for anyone concerned with the present state of societal discourse, especially in the United States. It uses the term “information disorder” to denote the broad societal challenges associated with misinformation and disinformation. In the report, disinformation is defined as false or misleading information, intentionally created or strategically amplified to mislead for a purpose (for example, political, financial or social gain), whereas misinformation is false or misleading information that is not necessarily spread with an awareness that it is false or misleading.

It is painfully obvious that mis- and disinformation are common in much societal discourse today, and the report notes that when bad information becomes as prevalent, persuasive and persistent as good information, it creates a chain reaction of harm. Featured prominently as the report’s opening statement is the idea that information disorder is a crisis that exacerbates all other crises. In this observation, the report echoes and reinforces other bodies that have come to similar conclusions, such as the Bulletin of the Atomic Scientists. I too have called cyber-enabled information warfare an existential threat to modern civilization as we know it.

Particularly important for me is the commission’s willingness to acknowledge a broad swath of communities across the political spectrum that have been harmed by information disorder. Some have been the targets of racialized or bigoted disinformation. For these communities, an opportunistic and divisive politicization of critical race theory has led to an effort to further minimize already-limited historical discussions of slavery, Jim Crow, Reconstruction, Native American oppression and genocide, and other recognized nadirs in American history, and to replace such discussions with sanitized renderings that diminish awareness of racism and other injustices. A lack of general awareness of both historical mistreatment (such as the Tuskegee medical abuse) and continuing, deeply embedded inequities rightly leaves many persons of color feeling unheard and unseen, even as they remain targets of racially motivated attacks.

But the report also acknowledges the lived experiences of those in other marginalized communities—including, for example, rural and working-class white Americans. Their lived experiences (for instance, losing their job security and their sense of cultural identity as Americans) have made them feel that their concerns have not been heard or seen by either of the traditional political parties. Such inattention has made these communities suspicious of “authorities” and distrustful of mainstream media, which they believe to be full of “fake news.” Thus, they are increasingly easy targets for mis- and disinformation and political manipulation; many believe, in all sincerity, that the 2020 presidential election was stolen by the Democrats, that Donald Trump received more votes than Joe Biden, that the coronavirus pandemic is a hoax, that the coronavirus vaccine is useless, or that human-induced climate change is not real. The factual basis for such beliefs is nonexistent, but they are nevertheless strongly held. Not coincidentally, those who hold these beliefs often suffer materially from them—for example, coronavirus-related deaths occur predominantly among the unvaccinated.

These two groups are not equivalent, nor have they been affected by information disorder in the same ways. One group is struggling to have its fact-based truths acknowledged by society at large and has borne the weight of oppression for hundreds of years, while the other struggles because its beliefs are grounded in the disinformation and lies that superficially explain the erosion of its historically dominant position in society. Despite this dissimilarity, the chasms between the groups render the population as a whole incapable of building trust, as there is no common ground from which to operate. Yet common ground can be found only by drilling down to authentic individual experiences rather than ideological interpretations of generalized causation.

Finding common ground will require the identification of cognitive, emotional and epistemological paths for each group, leading to values that each group can recognize as shared despite conflicting expressions of those values, and to a shared factual basis for engagement and conversation. It will entail restorative efforts that combat the societally destructive effects of mis- and disinformation once it has been released and spread widely into all corners of the information environment. And at the very least, it will require that every party believes that the others have heard and understood its concerns. Bridging these chasms will not be possible without effective outreach to both groups—and that must start with recognizing their very existence.

The report is also significant in that it explicitly connects many aspects of information disorder to what happens in the physical, offline world. Today, the online world is the natural home for information. But mis- and disinformation had deep and profound effects in the offline world throughout human history, long before the birth of the World Wide Web, before the emergence of personal computing and before the internet. From the Trojan horse to slavery to the propaganda of the Third Reich, misleading claims and outright lies have led people to say and do foolish, evil and wicked things in the physical world. Indeed, as the report points out, one of the most challenging aspects of addressing information disorder is confronting the reality that mis- and disinformation often give recipients permission to believe things they were already predisposed to believe, resulting in what some observers have called a demand for deceit.

What is the root of such predispositions and beliefs? The report hints at a psychological basis, noting that digital experiences exploit cognitive biases, but in my view the report should have paid more explicit attention to a burgeoning scientific literature on the connections between human psychology and people’s receptiveness to mis- and disinformation. Concepts such as dual-system cognition, heuristic versus analytical thought, and finite cognitive capacity underlie behavioral economics, whose insights are becoming increasingly important in understanding how limited resources are actually distributed in the real world. Witting and unwitting spreaders of mis- and disinformation take advantage of these characteristics of human psychology, which increases the believability and acceptance of such information. These psychological phenomena leave people less able to exercise critical thinking in the appropriate contexts and situations, so that once exposed to mis- and disinformation, an individual is less likely to reject it as false or misleading.

Against this backdrop of finite cognitive capacity in humans, the significance of modern information technologies becomes much clearer. Social media, customized search engines, and ubiquitously available mobile computing and communications devices provide ready channels for the spread of mis- and disinformation; modern information technology enables the rapid and widespread dissemination of such information to millions of receptive and willing recipients.

In short, psychology and technology, when taken together, make the problem of finding common ground much worse and more difficult to address than would otherwise be the case. Nevertheless, finding common ground is necessary for national reconciliation.

Because the two groups have been negatively affected by information disorder in very different ways, the nature of the effort to find common ground will necessarily be different. For those whose fact-based truths have gone largely unacknowledged, the commission offered a number of strong recommendations—with which I fully concur—for building trust and reducing harms, focused on stemming the flow of mis- and disinformation to the public and on helping people resist the effects of such exposure. Actions in the offline world to redress historical wrongs are also likely to be necessary, though they are beyond the scope of the commission’s report.

But there are also large numbers of people in the second group who have already been significantly exposed to mis- and disinformation and now mistakenly believe in its truth. Communicating with such groups in meaningful ways—what I called “restorative efforts” above—that reduce the intensity of people’s belief in misinformation remains a substantial challenge.

To the best of my knowledge, little systematic research has been done to investigate how best to reach across this gap. To date, nearly all of the available evidence, though anecdotal, points to the need for acknowledging the starting points of such individuals, understanding “where they are coming from” and respectfully listening to their concerns. Such common ground, if it can be found, provides a point of departure for mutual dialogue that has a chance of changing minds.

But these interactions are (almost) necessarily conducted on a retail basis—one-on-one rather than en masse. Moreover, few people have either the skill or the patience to engage in such a manner, despite the potentially large payoff. As it happens, some people are good at finding common ground and engaging with “the other side” with compassion and understanding, or at least they are much better at it than the rest of us (and I wish I were one of the select few). The question remains, then: what can we learn from these gifted communicators? Further, how can “reach-across-the-divide” techniques be scaled up to operate on a wholesale rather than a retail basis? For example, are there less labor-intensive techniques that embed such an approach and enable outreach to larger numbers of people?

Finding more effective ways of communicating across the divide is one necessary part of any long-term solution to the problem of information disorder. The report’s recommendation on “healthy digital discourse” addresses one aspect of a solution—developing tools and venues to support dialogue. But I do wish that the commission had also addressed an equally important aspect of the solution: the scope and nature of the dialogue that these tools and venues would support.

The above comments notwithstanding, I am pleased to align myself with the general thrust of the commission’s report, both for how it sets an appropriate context for its recommendations and for the content of the recommendations themselves. The commission completed its work in about six months, a remarkably short time in which to produce a report of such substance, and it had to make hard choices about what to address in that time. Perhaps the above comments, which build on the commission’s work, can serve as a jumping-off point for further exploration.