14 June 2016

The Mistrust of Science

By Atul Gawande 
June 10, 2016

The following was delivered as the commencement address at the California Institute of Technology, on Friday, June 10th.
If this place has done its job—and I suspect it has—you’re all scientists now. Sorry, English and history graduates, even you. Science is not a major or a career. It is a commitment to a systematic way of thinking, an allegiance to a way of building knowledge and explaining the universe through testing and factual observation. The thing is, that isn’t a normal way of thinking. It is unnatural and counterintuitive. It has to be learned. Scientific explanation stands in contrast to the wisdom of divinity and experience and common sense. Common sense once told us that the sun moves across the sky and that being out in the cold produced colds. But a scientific mind recognized that these intuitions were only hypotheses. They had to be tested.

When I came to college from my Ohio home town, the most intellectually unnerving thing I discovered was how wrong many of my assumptions were about how the world works—whether the natural or the human-made world. I looked to my professors and fellow-students to supply my replacement ideas. Then I returned home with some of those ideas and told my parents everything they’d got wrong (which they just loved). But, even then, I was just swapping one set of received beliefs for another. It took me a long time to recognize the particular mind-set that scientists have. The great physicist Edwin Hubble, speaking at Caltech’s commencement in 1938, said a scientist has “a healthy skepticism, suspended judgement, and disciplined imagination”—not only about other people’s ideas but also about his or her own. The scientist has an experimental mind, not a litigious one.

As a student, I came to see this as more than a way of thinking. It was a way of being—a weird way of being. You are supposed to have skepticism and imagination, but not too much. You are supposed to suspend judgment, yet exercise it. Ultimately, you hope to observe the world with an open mind, gathering facts and testing your predictions and expectations against them. Then you make up your mind and either affirm or reject the ideas at hand. But you also hope to accept that nothing is ever completely settled, that all knowledge is just probable knowledge. A contradictory piece of evidence can always emerge. Hubble put it best when he said, “The scientist explains the world by successive approximations.”

The scientific orientation has proved immensely powerful. It has allowed us to nearly double our lifespan during the past century, to increase our global abundance, and to deepen our understanding of the nature of the universe. Yet scientific knowledge is not necessarily trusted. Partly, that’s because it is incomplete. But even where the knowledge provided by science is overwhelming, people often resist it—sometimes outright deny it. Many people continue to believe, for instance, despite massive evidence to the contrary, that childhood vaccines cause autism (they do not); that people are safer owning a gun (they are not); that genetically modified crops are harmful (on balance, they have been beneficial); that climate change is not happening (it is).

Vaccine fears, for example, have persisted despite decades of research showing them to be unfounded. Some twenty-five years ago, a statistical analysis suggested a possible association between autism and thimerosal, a preservative used in vaccines to prevent bacterial contamination. The analysis turned out to be flawed, but fears took hold. Scientists then carried out hundreds of studies, and found no link. Still, fears persisted. Countries removed the preservative but experienced no reduction in autism—yet fears grew. A British study claimed a connection between the onset of autism in eight children and the timing of their vaccinations for measles, mumps, and rubella. That paper was retracted due to findings of fraud: the lead author had falsified and misrepresented the data on the children. Repeated efforts to confirm the findings were unsuccessful. Nonetheless, vaccination rates plunged, leading to outbreaks of measles and mumps that, last year, sickened tens of thousands of children across the U.S., Canada, and Europe, and resulted in deaths.

People are prone to resist scientific claims when they clash with intuitive beliefs. They don’t see measles or mumps around anymore. They do see children with autism. And they see a mom who says, “My child was perfectly fine until he got a vaccine and became autistic.”

Now, you can tell them that correlation is not causation. You can say that children get a vaccine every two to three months for the first couple of years of their life, so the onset of any illness is bound to follow vaccination for many kids. You can say that the science shows no connection. But once an idea has become embedded and widespread, it becomes very difficult to dig it out of people’s brains—especially when they do not trust scientific authorities. And we are experiencing a significant decline in trust in scientific authorities.

The sociologist Gordon Gauchat studied U.S. survey data from 1974 to 2010 and found some deeply alarming trends. Despite increasing education levels, the public’s trust in the scientific community has been decreasing. This is particularly true among conservatives, even educated conservatives. In 1974, conservatives with college degrees had the highest level of trust in science and the scientific community. Today, they have the lowest.

Today, we have multiple factions putting themselves forward as what Gauchat describes as their own cultural domains, “generating their own knowledge base that is often in conflict with the cultural authority of the scientific community.” Some are religious groups (challenging evolution, for instance). Some are industry groups (as with climate skepticism). Others tilt more to the left (such as those that reject the medical establishment). As varied as these groups are, they are all alike in one way. They all harbor sacred beliefs that they do not consider open to question.

To defend those beliefs, few dismiss the authority of science. They dismiss the authority of the scientific community. People don’t argue back by claiming divine authority anymore. They argue back by claiming to have the truer scientific authority. It can make matters incredibly confusing. You have to be able to recognize the difference between claims of science and those of pseudoscience.

Science’s defenders have identified five hallmark moves of pseudoscientists. They argue that the scientific consensus emerges from a conspiracy to suppress dissenting views. They produce fake experts, who have views contrary to established knowledge but do not actually have a credible scientific track record. They cherry-pick the data and papers that challenge the dominant view as a means of discrediting an entire field. They deploy false analogies and other logical fallacies. And they set impossible expectations of research: when scientists produce one level of certainty, the pseudoscientists insist they achieve another.

It’s not that some of these approaches never provide valid arguments. Sometimes an analogy is useful, or higher levels of certainty are required. But when you see several or all of these tactics deployed, you know that you’re not dealing with a scientific claim anymore. Pseudoscience is the form of science without the substance.

The challenge of what to do about this—how to defend science as a more valid approach to explaining the world—has actually been addressed by science itself. Scientists have done experiments. In 2011, two Australian researchers compiled many of the findings in “The Debunking Handbook.” The results are sobering. The evidence is that rebutting bad science doesn’t work; in fact, it commonly backfires. Describing facts that contradict an unscientific belief actually spreads familiarity with the belief and strengthens the conviction of believers. That’s just the way the brain operates; misinformation sticks, in part because it gets incorporated into a person’s mental model of how the world works. Stripping out the misinformation therefore fails, because it threatens to leave a painful gap in that mental model—or no model at all.

So, then, what is a science believer to do? Is the future just an unending battle of warring claims? Not necessarily. Emerging from the findings was also evidence that suggested how you might build trust in science. Rebutting bad science may not be effective, but asserting the true facts of good science is. And including the narrative that explains them is even better. You don’t focus on what’s wrong with the vaccine myths, for instance. Instead, you point out: giving children vaccines has proved far safer than not. How do we know? Because of a massive body of evidence, including the fact that we’ve tried the alternate experiment before. Between 1989 and 1991, vaccination among poor urban children in the U.S. dropped. And the result was fifty-five thousand cases of measles and a hundred and twenty-three deaths.

The other important thing is to expose the bad science tactics that are being used to mislead people. Bad science has a pattern, and helping people recognize the pattern arms them to come to more scientific beliefs themselves. Having a scientific understanding of the world is fundamentally about how you judge which information to trust. It doesn’t mean poring through the evidence on every question yourself. You can’t. Knowledge has become too vast and complex for any one person, scientist or otherwise, to convincingly master more than corners of it.

Few working scientists can give a ground-up explanation of the phenomenon they study; they rely on information and techniques borrowed from other scientists. Knowledge and the virtues of the scientific orientation live far more in the community than the individual. When we talk of a “scientific community,” we are pointing to something critical: that advanced science is a social enterprise, characterized by an intricate division of cognitive labor. Individual scientists, no less than the quacks, can be famously bull-headed, overly enamored of pet theories, dismissive of new evidence, and heedless of their fallibility. (Hence Max Planck’s observation that science advances one funeral at a time.) But as a community endeavor, it is beautifully self-correcting.

Beautifully organized, however, it is not. Seen up close, the scientific community—with its muddled peer-review process, badly written journal articles, subtly contemptuous letters to the editor, overtly contemptuous subreddit threads, and pompous pronouncements of the academy—looks like a rickety vehicle for getting to truth. Yet the hive mind swarms ever forward. It now advances knowledge in almost every realm of existence—even the humanities, where neuroscience and computerization are shaping understanding of everything from free will to how art and literature have evolved over time.

Today, you become part of the scientific community, arguably the most powerful collective enterprise in human history. In doing so, you also inherit a role in explaining it and helping it reclaim territory of trust at a time when that territory has been shrinking. In my clinic and my work in public health, I regularly encounter people who are deeply skeptical of even the most basic knowledge established by what journalists label “mainstream” science (as if the other thing is anything like science)—whether it’s facts about physiology, nutrition, disease, medicines, you name it. The doubting is usually among my most, not least, educated patients. Education may expose people to science, but it has a countervailing effect as well, leading people to be more individualistic and ideological.

The mistake, then, is to believe that the educational credentials you get today give you any special authority on truth. What you have gained is far more important: an understanding of what real truth-seeking looks like. It is the effort not of a single person but of a group of people—the bigger the better—pursuing ideas with curiosity, inquisitiveness, openness, and discipline. As scientists, in other words.

Even more than what you think, how you think matters. The stakes for understanding this could not be higher than they are today, because we are not just battling for what it means to be scientists. We are battling for what it means to be citizens.


Atul Gawande, a surgeon and public-health researcher, became a New Yorker staff writer in 1998.
