11 March 2019

Lethal Autonomous Weapons Systems: Recent Developments

By Hayley Evans, Natalie Salmanowitz

On March 25-29, the U.N.’s Group of Governmental Experts (GGE) will meet for the third consecutive year to discuss developments and strategies in the field of lethal autonomous weapons systems (LAWS). As a subsidiary body of the Convention on Certain Conventional Weapons (CCW), the GGE brings together High Contracting Parties, state signatories, international organizations, nongovernmental organizations and academic bodies in an effort to define LAWS, debate best practices, and recommend steps to address the potential development and use of LAWS in the future. It’s been six months since the GGE last met, and this will be the first of two GGE meetings taking place in 2019 (for more information on the GGE’s prior meetings, see here and here). This post will cover all you need to know about where relevant stakeholders stand leading up to the March meeting.

Background on LAWS

As a general matter, LAWS are weapons that can select, detect and engage targets with little to no human intervention. Though there is no single accepted definition of LAWS, the term typically covers a broad array of potential weapons systems, ranging from fully autonomous weapons that can launch attacks without any human involvement to semi-autonomous weapons that require affirmative human action to execute a mission. Critics of LAWS focus primarily on fully autonomous weapons, dubbing LAWS “killer robots” and questioning their ability to respect human life and comply with international humanitarian law (IHL). Others, like the U.S. government, foresee potential advantages of the technology, arguing that LAWS’s automated targeting features might actually augment states’ abilities to meet IHL requirements through increased accuracy and efficiency. While it’s too soon to tell whether LAWS’s capabilities are a feature or a bug, the GGE’s ultimate decisions may have profound consequences for the development and use of LAWS.

Global Developments

Before reviewing the GGE’s and High Contracting Parties’ most recent meetings, it’s worth surveying the global pulse on attitudes toward LAWS and highlighting key developments in the public and private spheres.

To start, Human Rights Watch (HRW) and the Campaign to Stop Killer Robots (CSKR)—two of the chief proponents of a preemptive LAWS ban—have kept busy on the advocacy front. In August 2018, HRW published a report in conjunction with Harvard Law School’s International Human Rights Clinic (IHRC) entitled, “Heed the Call: A Moral and Legal Imperative to Ban Killer Robots.” As in its earlier reports—see here and here—HRW called for a preemptive ban on the development, production and use of LAWS. But this new report went one step further, arguing that fully autonomous weapons would contravene the Martens Clause, which was introduced in the preamble to the 1899 Hague Convention (II) on the Laws and Customs of War on Land and effectively guarantees a base level of protection under IHL even in the absence of specifically applicable treaties. According to HRW and the IHRC, fully autonomous weapons would be unable to comply with “principles of humanity” and “dictates of public conscience”—the Martens Clause’s two fundamental pillars.

A few months later, HRW and CSKR probed this idea of public conscience further, releasing results from a market research study on the strategic, legal and moral implications of LAWS. The study found that 61 percent of adults surveyed across 26 countries oppose LAWS—a five-percentage-point increase over 2017 survey results. Moreover, a majority of survey respondents in 20 of these countries expressed disapproval of LAWS, including those in countries whose governments have opposed a preemptive ban. Accordingly, CSKR concluded that “public opinion is in line with [CSKR’s] call for action to prevent the development of killer robots.” Although these surveys do not directly inform analyses under international law (unless, as HRW and the IHRC suggest, they contribute to an understanding of the “dictates of public conscience”), they do provide an interesting proxy for how opinio juris—a state’s belief that something is legally obligatory—is developing with respect to LAWS.

Apart from HRW and CSKR’s efforts, at the Paris Peace Forum marking the 100th anniversary of the end of World War I, U.N. Secretary-General António Guterres explicitly called for a ban on LAWS, stating, “Imagine the consequences of an autonomous system that could, by itself, target and attack human beings. I call upon States to ban these weapons, which are politically unacceptable and morally repugnant.” And in mid-February, at the American Association for the Advancement of Science’s annual meeting, participants expressed dissatisfaction with the GGE’s overall progress. In particular, CSKR declared its intention to refocus its advocacy efforts domestically given the relative inaction and “diploma[tic] ... fail[ures]” at the international level.

Meanwhile, in the private sector, LAWS have garnered significant attention as well. In June 2018, Google came under fire as thousands of its employees signed a petition urging the company to cease involvement in Project Maven—a contract with the Department of Defense to develop artificial intelligence for analyzing drone footage (which Google employees feared could one day facilitate the development or use of LAWS). Facing pressure from employees and technology experts across the globe, Google subsequently announced its decision not to renew its contract for Project Maven and vowed not to “design or deploy AI … [for] technologies that cause or are likely to cause overall harm.” In July 2018, over 200 organizations and 3,000 individuals (including Elon Musk, Google DeepMind’s founders and CEOs of various robotics companies) followed suit, pledging to “neither participate in nor support the development, manufacture, trade, or use of lethal autonomous weapons.” In light of these highly publicized events, the Defense Department recently tasked the Defense Innovation Board (comprising high-profile Silicon Valley tech leaders) with developing ethical principles to guide the department’s use of AI in military weapons and operations. The board has already concluded its first meeting and plans to publicly release its recommendations this June.

Highlights from the GGE’s August 2018 Meeting

While members of the private and public sectors have started to take concrete actions against LAWS, the same cannot be said of the GGE, despite increasing opposition to such weapons.

By the time the GGE met last August, 26 states supported a ban on fully autonomous weapons systems—four more than at the April 2018 meeting. However, 12 states—including Russia, the U.S. and the U.K.—opposed even negotiating a treaty on LAWS.

In advance of the August meeting, eight states submitted working papers. Though the papers discussed a wide variety of issues—ranging from the proper terminology and characterizations of LAWS to suggested approaches for regulating their development and use—the most commonly discussed issue concerned the concept of meaningful human control. While multiple papers reiterated the importance of holding humans accountable for their decisions to develop and deploy LAWS, some states expressed differing views on the proper way to conceptualize human control. For instance, whereas Brazil viewed human control as inextricably tied to the weapon’s level of autonomy, the U.S. sought to refocus the debate on human “judgment,” arguing that the key question is not the extent of control a human retains over the weapon, but whether “machines [can] effectuate the intention of commanders” and “enable personnel to exercise appropriate levels of judgment over the use of force.” According to the U.S., fewer opportunities for human control (and higher degrees of automation) can lead to greater alignment between human intentions and actual outcomes. Meanwhile, France appeared to express a middle-ground view, acknowledging that autonomy can improve the decision-making process, but expressing concern with operators’ ability to take charge of LAWS given their potentially inexplicable and unpredictable nature. States expressed a similar variety of positions during the meeting itself, and the GGE ultimately decided to continue these discussions at the next meeting. (Ljupčo Jivan Gjorgjinski, the chairman for the 2019 meeting, has specifically included discussions on human control and human-machine interactions in the March agenda.)

In an effort to convert their discussions into action items, states and organizations also proposed three main avenues to address the future development and use of LAWS. On one end of the spectrum, Austria, Brazil and Chile urged the GGE to “negotiate a legally-binding instrument” to address LAWS. The majority of delegations favored this option, with some states and organizations renewing their support for a ban, while others advocated for some degree of regulation (albeit in an unspecified form). Notwithstanding this widespread support, five states—the U.S., Russia, Australia, South Korea and Israel—effectively quashed further conversations on the matter. (As CSKR noted in its discussion of the survey mentioned above, these countries’ opposition to negotiating a legally binding instrument is particularly interesting given that—with the exception of Israel—a majority of survey respondents in each of these countries oppose “the use of [LAWS] in war.”) On the other end of the spectrum, a number of states—including Australia, the U.K. and Argentina—proposed continuing discussions “of existing obligations under international law” and elucidating best practices under IHL, specifically under Article 36 of the First Additional Protocol to the Geneva Conventions.

As a third, intermediate approach, Germany and France suggested a political declaration to formally express areas of consensus and elaborate guiding principles regarding human control and accountability. At least 10 states’ delegations voiced support for this option, with some (such as Spain and Sri Lanka) viewing it as a stepping stone toward restrictions on LAWS, and others (such as Ireland and Poland) expressing general interest in the idea. By the end of the August meeting, the GGE voted to include a fourth and final option in its report—namely, a recognition that “no further legal measures were needed” since “IHL is fully applicable to potential [LAWS].” However, the GGE ultimately kicked the can down the road: it recommended meeting again in 2019 under its current mandate and declined to formally adopt any of the proposed measures.

The August meeting was notable for two final reasons. First, according to commentary on the meeting by Reaching Critical Will (the disarmament division of the Women’s International League for Peace and Freedom and a frequent commentator on CCW meetings), the U.S. and Russia shocked other members of the GGE by doubting the relevance of international human rights law to autonomous weapons systems—even though prior GGE meetings appeared to take the applicability of such law as a given. In response, multiple states—such as Costa Rica, Panama, China and Cuba—pushed back, proposing a variety of solutions ranging from maintaining an explicit reference to international human rights law to mentioning the U.N. Charter. The GGE’s report—per the recommendation of China—“affirmed that international law, in particular the United Nations Charter and [IHL] as well as relevant ethical perspectives, should guide the continued work of the Group.” Second, much of the GGE’s debate centered on broader messaging concerns. Whereas some states, like the U.S., urged the GGE to discuss the benefits of LAWS (such as the capacity for greater targeting precision and less collateral damage), others fervently opposed any mention of such benefits absent an accompanying explanation of the associated risks. Similarly, a handful of states stressed the importance of “avoid[ing] the image that states believe” LAWS “are already in operation”—or “that these systems will be in operation one day.”

Highlights from the High Contracting Parties’ November 2018 Meeting

Following the GGE’s August 2018 meeting, “all CCW States parties” convened for the Meeting of the High Contracting Parties to the CCW Nov. 21–23, 2018. Since the Convention and its Protocols cover all sorts of weapons and weapons systems, only a fraction of the November meeting dealt specifically with LAWS. But there were two developments of note. First, the International Committee of the Red Cross submitted a working paper prior to the meeting, which recommended that states develop an understanding of human control—a focus dictated by law and ethics—and provided questions to help inform the development of a practical understanding of the concept. Second—and perhaps most importantly—El Salvador and Morocco each called for a LAWS ban during the meeting, raising the number of states officially in support of a ban from 26 to 28.

Looking Ahead

So what to expect this March? According to Chairman Gjorgjinski, the “IHL prism” is the name of the game: IHL principles will “permeate all areas of [the GGE]’s focus.” While the tentative agenda does not include general debate, it does provide for discussions on the impact of LAWS on IHL, with a premium on “precis[ion] and specific[ity].” But building consensus may prove especially difficult this time around—unlike previous meetings, which lasted for 10 days, the GGE will meet for just seven days this year, only five of which will involve substantive debate. And as CSKR points out, all it takes is “one state [to] block agreement sought by the rest,” an outcome that may be all the more likely given the highly condensed opportunities for meaningful discussion.

On March 8, stakeholders will submit working papers to the CCW (which can be found here). If past is prologue, these working papers will set the tone for the March meeting as states and organizations stake out their positions on various topics and identify likely pressure points in the upcoming debate.
