Ben Smith
On Friday afternoons this fall, top American news executives have dialed into a series of off-the-record Zoom meetings led by Harvard academics whose goal is to “help newsroom leaders fight misinformation and media manipulation.”
Those are hot topics in the news industry right now, and so the program at Harvard University’s Shorenstein Center on Media, Politics and Public Policy drew an impressive roster of executives at CNN, NBC News, The Associated Press, Axios and other major U.S. outlets.
A couple of them, though, told me they were puzzled by the reading package for the first session.
It consisted of a Harvard case study, which a participant shared with me, examining the coverage of Hunter Biden’s lost laptop in the final days of the 2020 campaign. The story had been pushed by aides and allies of then-President Donald J. Trump who tried to persuade journalists that the hard drive’s contents would reveal the corruption of the father.
The news media’s handling of that narrative provides “an instructive case study on the power of social media and news organizations to mitigate media manipulation campaigns,” according to the Shorenstein Center summary.
The Hunter Biden laptop saga sure is instructive about something. As you may recall, panicked Trump allies frantically dumped its contents onto the internet and into reporters’ inboxes, a trove that apparently included embarrassing images and emails purportedly from the candidate’s son showing that he had tried to trade on the family name. The big social media platforms, primed for a repeat of the WikiLeaks 2016 election shenanigans, reacted forcefully: Twitter blocked links to a New York Post story that tied Joe Biden to the emails without strong evidence (though Twitter quickly reversed that decision) and Facebook limited the spread of the Post story under its own “misinformation” policy.
But as it now appears, the story about the laptop was an old-fashioned, politically motivated dirty tricks campaign, and describing it with the word “misinformation” doesn’t add much to our understanding of what happened. While some of the emails purportedly on the laptop have since been called genuine by at least one recipient, the younger Mr. Biden has said he doesn’t know if the laptop in question was his. And the “media manipulation campaign” was a threadbare, 11th-hour effort to produce a late-campaign scandal, an attempt at an October Surprise that has been part of nearly every presidential campaign I’ve covered.
The Wall Street Journal, as I reported at the time, looked hard at the story. Unable to prove that Joe Biden had tried, as vice president, to change U.S. policy to enrich a family member, The Journal refused to tell it the way the Trump aides wanted, leaving that spin to the right-wing tabloids. What remained was a murky situation that is hard to call “misinformation,” even if some journalists and academics like the clarity of that label. The Journal’s role was, in fact, a pretty standard journalistic exercise, a blend of fact-finding and the sort of news judgment that has fallen a bit out of favor as journalists have found themselves chasing social media.
While some academics use the term carefully, “misinformation” in the case of the lost laptop was more or less synonymous with “material passed along by Trump aides.” And in that context, the phrase “media manipulation” refers to any attempt to shape news coverage by people whose politics you dislike. (Emily Dreyfuss, a fellow at the Technology and Social Change Project at the Shorenstein Center, told me that “media manipulation,” despite its sinister ring, is “not necessarily nefarious.”)
The focus on who’s saying something, and how they’re spreading their claims, can pretty quickly lead Silicon Valley engineers to slap the “misinformation” label on something that is, in plainer English, true.
Shorenstein’s research director, Joan Donovan, who is leading the program and raised its funding from the John S. and James L. Knight Foundation, said that the Hunter Biden case study was “designed to cause conversation — it’s not supposed to leave you resolved as a reader.”
Ms. Donovan, a force on Twitter and a longtime student of the shadiest corners of the internet, said she defines “misinformation” as “false information that’s being spread.” She strongly objected to my suggestion that the term lacks a precise meaning.
She added that, appearances aside, she doesn’t believe the word is merely a left-wing label for things that Democrats don’t like. Instead, she traces the modern practice of “disinformation” (that is, deliberate misinformation) to the anti-corporate activists the Yes Men, famous for hoaxed corporate announcements and other stunts, and the “culture jamming” of Adbusters. But their tools, she wrote, have been adopted by “foreign operatives, partisan pundits, white supremacists, violent misogynists, grifters and scammers.”
Ms. Donovan is among the scholars who have tried to unravel the knotty information tangle of contemporary politics. She’s currently a compulsive consumer of Steve Bannon’s influential podcast, “War Room.” Like many of the journalists and academics who study our chaotic media environment, she has zeroed in on the way that trolls and pranksters developed tactics for angering and tricking people online over the first half of the last decade, and how those people brought their tactics to right-wing reactionary politics in the decade’s second half.
To the people paying close attention, this new world was riveting and dangerous — and it was maddening that outsiders couldn’t see what was happening. For these information scholars, widespread media manipulation seemed like the main event of recent years, the main driver of millions of people’s beliefs, and the main reason Mr. Trump and people like him won elections all over the world. But this perspective, while sometimes revelatory, may leave little space for other causes of political action, or for other types of political lies, like the U.S. government’s long deception on its progress in the war in Afghanistan.
What had been a niche preoccupation has now been adopted by people who have spent somewhat less time on 4chan than Ms. Donovan. The broadcaster Katie Couric recently led the Aspen Institute’s Commission on Information Disorder. I moderated a panel at Bloomberg’s New Economy Forum with a different, somewhat dental, label for the same set of issues, “truth decay.” (The RAND Corporation seems to have coined that one, though T Bone Burnett did release an album by that name in 1980.) There, an Australian senator, Sarah Hanson-Young, said she thought the biggest culprit in misleading her fellow citizens about climate change had been Rupert Murdoch’s News Corp — hardly a new issue, or one that needs a new name. The New York Post’s insistence that the emails prove President Biden’s corruption, and not just his son’s influence peddling, is part of the same partisan genre.
This hints at a weakness of the new focus on misinformation: It’s a technocratic solution to a problem that’s as much about politics as technology. The new social media-fueled right-wing populists lie a lot, and stretch the truth more. But as American reporters quizzing Donald Trump’s fans on camera discovered, his audience was often in on the joke. And many of the most offensive things he said weren’t necessarily lies — they were just deeply ugly to half the country, including most of the people running news organizations and universities.
It’s more comfortable to reckon with an information crisis — if there’s anything we’re good at, it’s information — than a political one. If only responsible journalists and technologists could explain how misguided Mr. Trump’s statements were, surely the citizenry would come around. But these well-meaning communications experts never quite understood that the people who liked him knew what was going on, laughed about it and voted for him despite, or perhaps even because of, the times he went “too far.”
Harper’s Magazine recently published a broadside against “Big Disinfo,” contending that the think tanks raising money to focus on the topic were offering a simple solution to a political crisis that defies easy explanation and exaggerating the power of Facebook in a way that, ultimately, served Facebook most of all. The author, Joseph Bernstein, argued that the journalists and academics who specialize in exposing instances of disinformation seem to believe they have a particular claim on truth. “However well-intentioned these professionals are, they don’t have special access to the fabric of reality,” he wrote.
In fact, I’ve found many of the people worrying about our information diets are reassuringly modest about how far the new field of misinformation studies is going to take us. Ms. Donovan calls it “a new field of data journalism,” but said she agreed that “this part of the field needs to get better at figuring out what’s true or false.” The Aspen report acknowledged “that in a free society there are no ‘arbiters of truth.’” They’re putting healthy new pressure on tech platforms to be transparent in how claims — true and false — spread.
The editor in chief of The Texas Tribune, Sewell Chan, one of the Harvard course’s participants, said he didn’t think the program had a political slant, adding that it “helped me understand the new forms of mischief making and lie peddling that have emerged.”
“That said, like the term ‘fake news,’ misinformation is a loaded and somewhat subjective term,” he said. “I’m more comfortable with precise descriptions.”
I also feel the push and pull of the information ecosystem in my own journalism, as well as the temptation to evaluate a claim by its formal qualities — who is saying it and why — rather than its substance. Last April, for instance, I tweeted about what I saw as the sneaky way that anti-China Republicans around Donald Trump were pushing the idea that Covid-19 had leaked from a lab. There were informational red flags galore. But media criticism (and I’m sorry you’ve gotten this far into a media column to read this) is skin-deep. Below the partisan shouting match was a more interesting scientific shouting match (which also made liberal use of the word “misinformation”). And the state of that story now is that scientists’ understanding of the origins of Covid-19 is evolving and hotly debated, and we’re not going to be able to resolve it on Twitter.
The story of tech platforms helping to spread falsehoods is still incredibly important, as is the work of identifying stealthy social media campaigns from Washington to, as my colleague Davey Alba recently reported, Nairobi. And the Covid-19 pandemic also gave everyone from Mark Zuckerberg to my colleagues at The New York Times a new sense of urgency about, for instance, communicating the seriousness of the pandemic and the safety of vaccines in a media landscape littered with false reports.
But politics isn’t a science. We don’t need to mystify the old-fashioned practice of news judgment with a new terminology. There’s a danger in adopting jargony new frameworks we haven’t really thought through. The job of reporters isn’t, ultimately, to put neat labels on the news. It’s to report out what’s actually happening, as messy and unsatisfying as that can be.