20 August 2024

The Big Picture

We all know the trope: a machine grows so intelligent that its apparent consciousness becomes indistinguishable from our own, and then it surpasses us – and possibly even turns against us. As investment pours into efforts to make such technology – so-called artificial general intelligence (AGI) – a reality, how scared of such scenarios should we be?

According to MIT’s Daron Acemoglu, the focus on “catastrophic risks due to AGI” is excessive and misguided, because it “(unhelpfully) anthropomorphizes AI” and “leads us to focus on the wrong targets.” A more productive discussion would focus on the factors that will determine whether AI is used for good or bad: “who controls [the technology], what their objectives are, and what kind of regulations they are subjected to.”

Joseph S. Nye, Jr., agrees that, whatever might happen with AGI in the future, the “growing risks from today’s narrow AI,” such as autonomous weapons and new forms of biological warfare, “already demand greater attention.” China, he points out, is betting big on an “AI arms race,” seeking to benefit from “structural advantages” such as the relative lack of “legal or privacy limits on access to data” for training models.

As Oscar-winning filmmaker and tech investor Charles Ferguson explains, China now “dominates world markets for mass-produced dual-use hardware such as drones and robots.” And while the US, Western Europe, Taiwan, and South Korea still lead China (and Russia) in most of the technologies that make up the “stack” underpinning AI-driven products, their “lead is narrowing.” Given the slow pace of policy debates and legislative processes – “not to mention the product cycles of the Pentagon and legacy defense contractors” – they may soon fall behind.

Another area where China is advancing fast is surveillance technology, such as facial-recognition AI. As MIT’s Martin Beraja, Harvard’s David Y. Yang, and the University of Oxford’s Noam Yuchtman found in a recent study, “autocracies and weak democracies” are lining up to buy what China is selling. Worryingly, they are especially likely to do so in years of domestic unrest, and these countries appear to be “less likely to develop into mature democracies than peer countries with low imports of surveillance AI.” As with other goods that generate negative externalities, “tighter AI trade regulation” is in order.
