Tiffany Hsu, Stuart A. Thompson and Steven Lee Myers
Billions of people will vote in major elections this year — around half of the global population, by some estimates — in one of the largest and most consequential democratic exercises in living memory. The results will affect how the world is run for decades to come.
At the same time, false narratives and conspiracy theories have evolved into an increasingly global menace.
Baseless claims of election fraud have battered trust in democracy. Foreign influence campaigns regularly target polarizing domestic challenges. Artificial intelligence has supercharged disinformation efforts and distorted perceptions of reality. All while major social media companies have scaled back their safeguards and downsized election teams.
“Almost every democracy is under stress, independent of technology,” said Darrell M. West, a senior fellow at the Brookings Institution think tank. “When you add disinformation on top of that, it just creates many opportunities for mischief.”
It is, he said, a “perfect storm of disinformation.”
The global calendar includes at least 83 elections, the largest concentration for at least the next 24 years, according to the consulting firm Anchor Change.
Those elections are spread around the world, including in Europe, where 27 member countries of the European Union will vote in its parliamentary election this June.
That amounts to more than four billion people, by some estimates.
January alone has at least seven elections. Taiwan, which is trying to ward off Chinese disinformation campaigns, votes for a new president on Jan. 13.
Pakistan and Indonesia, the world's two most populous Muslim-majority countries, hold elections a week apart in February; both have fought to balance freedom of speech with efforts to combat disinformation.
In India, where the prime minister has warned about misleading A.I. content, general elections are scheduled for the spring.
Elections for the European Parliament will take place in June as the European Union continues to put into effect a new law meant to contain corrosive online content.
A presidential election in Mexico that same month could be affected by a feedback loop of false narratives from elsewhere in the Americas.
The United States, already in the thick of a presidential race marked by resurgent lies about voting fraud, goes to the polls in November.
National elections are also planned in places where democracy has struggled to take root. Russia and Ukraine, which have both scheduled presidential elections, are issuing dueling narratives about their continuing war.
In Africa, one of the most critical elections on the continent will take place in South Africa, which has faced xenophobic disinformation campaigns in the past.
The stakes are enormous.
Democracy, which spread globally after the end of the Cold War, faces mounting challenges worldwide — from mass migration to climate disruption, from economic inequities to war. The struggle in many countries to respond adequately to such tests has eroded confidence in liberal, pluralistic societies, opening the door to appeals from populists and strongman leaders.
Autocratic countries, led by Russia and China, have seized on the currents of political discontent to push narratives undermining democratic governance and leadership, often by sponsoring disinformation campaigns. If those efforts succeed, the elections could accelerate the recent rise in authoritarian-minded leaders.
Fyodor A. Lukyanov, an analyst who leads a Kremlin-aligned think tank in Moscow, the Council on Foreign and Defense Policy, argued recently that 2024 “could be the year when the West’s liberal elites lose control of the world order.”
The political establishment in many nations, as well as intergovernmental organizations like the Group of 20, appears poised for upheaval, said Katie Harbath, founder of the technology policy firm Anchor Change and a former public policy director at Facebook who managed election work. Disinformation, spread via social media but also through print, radio, television and word of mouth, risks destabilizing the political process.
“We’re going to hit 2025 and the world is going to look very different,” she said.
Aggressive State Operatives
Among the biggest sources of disinformation in election campaigns are autocratic governments seeking to discredit democracy as a global model of governance.
Russia, China and Iran have all been cited in recent months by researchers and the U.S. government as likely to attempt influence operations to disrupt other countries’ elections, including this year’s U.S. presidential election. The countries see the coming year as “a real opportunity to embarrass us on the world stage, exploit social divisions and just undermine the democratic process,” said Brian Liston, an analyst at Recorded Future, a digital security company that recently reported on potential threats to the American presidential race.
The company also examined “Doppelgänger,” a Russian influence effort that Meta first identified last year, which seemed to impersonate international news organizations and create fake accounts to spread Russian propaganda in the United States and Europe. Doppelgänger appeared to have used widely available artificial intelligence tools to create news outlets dedicated to American politics, with names like Election Watch and My Pride.
Disinformation campaigns like this easily traverse borders.
Conspiracy theories, such as claims that the United States schemes with collaborators in various countries to engineer local power shifts or that it operates secret biological weapons factories in Ukraine, have sought to discredit American and European political and cultural influence around the world. They could appear in Urdu in Pakistan while also surfacing, in a different script and language, in Russia, shifting public opinion in those countries in favor of anti-Western politicians.
The false narratives volleying around the world are often shared by diaspora communities or orchestrated by state-backed operatives. Experts predict that election fraud narratives will continue to evolve and reverberate, as they did in the United States and Brazil in 2022 and then in Argentina in 2023.
A Cycle of Polarization and Extremism
An increasingly polarized and combative political environment is breeding hate speech and misinformation, which pushes voters even further into silos. A motivated minority of extreme voices, aided by social media algorithms that reinforce users’ biases, is often drowning out a moderate majority.
“We are in the middle of redefining our societal norms about speech and how we hold people accountable for that speech, online and offline,” Ms. Harbath said. “There are a lot of different viewpoints on how to do that in this country, let alone around the globe.”
Some of the most extreme voices seek one another out on alternative social media platforms, like Telegram, BitChute and Truth Social. Calls to pre-emptively stop voter fraud — which historically is statistically insignificant — recently trended on such platforms, according to Pyrra, a company that monitors threats and misinformation.
The “prevalence and acceptance of these narratives is only gaining traction,” even directly influencing electoral policy and legislation, Pyrra found in a case study.
“These conspiracies are taking root amongst the political elite, who are using these narratives to win public favor while degrading the transparency, checks and balances of the very system they are meant to uphold,” the company’s researchers wrote.
A.I.’s Risk-Reward Proposition
Artificial intelligence “holds promise for democratic governance,” according to a report from the University of Chicago and Stanford University. Politically focused chatbots could inform constituents about key issues and better connect voters with elected officials.
The technology could also be a vector for disinformation. Fake A.I. images have already been used to spread conspiracy theories, such as the unfounded assertion that there is a global plot to replace white Europeans with nonwhite immigrants.
In October, Jocelyn Benson, Michigan’s secretary of state, wrote to Senator Chuck Schumer, Democrat of New York and the majority leader, saying that “A.I.-generated content may supercharge the believability of highly localized misinformation.”
“A handful of states — and particular precincts within those states — are likely to decide the presidency,” she said. “Those seeking to sway outcomes or sow chaos may enlist A.I. tools to mislead voters about wait times, closures or even violence at specific polling locations.”
Lawrence Norden, who runs the elections and government program at the Brennan Center for Justice, a public policy institute, added that A.I. could imitate large amounts of material from election offices and spread it widely. Or it could manufacture late-stage October surprises, like the audio with signs of A.I. intervention that was released during Slovakia’s tight election this fall.
“All of the things that have been threats to our democracy for some time are potentially made worse by A.I.,” Mr. Norden said while participating in an online panel in November. (During the event, organizers introduced an artificially manipulated version of Mr. Norden to underscore the technology’s abilities.)
Some experts worry that the mere presence of A.I. tools could weaken trust in information and enable political actors to dismiss real content. Others said fears, for now, are overblown. Artificial intelligence is “just one of many threats,” said James M. Lindsay, senior vice president at the Council on Foreign Relations think tank.
“I wouldn’t lose sight of all the old-fashioned ways of sowing misinformation or disinformation,” he said.
Big Tech Scales Back Protections
In countries with general elections planned for 2024, disinformation has become a major concern for a vast majority of people surveyed by UNESCO, the United Nations’ cultural organization. And yet efforts by social media companies to limit toxic content, which escalated after the American presidential election in 2016, have recently tapered off, if not reversed entirely.
Meta, YouTube and X, the platform formerly known as Twitter, last year downsized or reshaped the teams responsible for keeping dangerous or inaccurate material in check, according to a recent report by Free Press, an advocacy organization. Some are offering new features, like private one-way broadcasts, that are especially difficult to monitor.
The companies are starting the year with “little bandwidth, very little accountability in writing and billions of people around the world turning to these platforms for information” — not ideal for safeguarding democracy, said Nora Benavidez, the senior counsel at Free Press.
Newer platforms, such as TikTok, will very likely begin playing a larger role in political content. Substack, the newsletter start-up that last month said it would not ban Nazi symbols and extremist rhetoric from its platform, wants the 2024 voting season to be “the Substack Election.” Politicians are planning livestreamed events on Twitch, which is also hosting a debate between A.I.-generated versions of President Biden and former President Donald J. Trump.
Meta, which owns Facebook, Instagram and WhatsApp, said in a blog post in November that it was in a “strong position to protect the integrity of next year’s elections on our platforms.” (Last month, a company-appointed oversight board took issue with Meta’s automated tools and its handling of two videos related to the Israel-Hamas conflict.)
YouTube wrote last month that its “elections-focused teams have been working nonstop to make sure we have the right policies and systems in place.” The platform said this summer that it would stop removing false voter fraud narratives. (YouTube said it wanted voters to hear all sides of a debate, though it noted that “this isn’t a free pass to spread harmful misinformation or promote hateful rhetoric.”)
Such content proliferated on X after the billionaire Elon Musk took over in late 2022. Months later, Alexandra Popken left her role managing trust and safety for the platform. Many social media companies are leaning heavily on unreliable A.I.-powered content moderation tools, leaving stripped-down crews of humans in constant firefighting mode, said Ms. Popken, who later joined the content moderation company WebPurify.
“Election integrity is such a behemoth effort that you really need a proactive strategy, a lot of people and brains and war rooms,” she said.