13 February 2021

Command Accountability for AI Weapon Systems in the Law of Armed Conflict

James Kraska, U.S. Naval War College

The use of artificial intelligence (AI) in weapon systems enhances the ability of operational forces to fuse multispectral sensor data to understand the warfighting environment; to positively identify, track, and select targets; and to engage them with the most appropriate effects. The potential for AI to help close the “kill chain” has raised concern that it opens a gap in accountability between the decisions of humans and the acts of machines, leaving humans no longer accountable for decisions made during armed conflict. This study suggests that there is no such gap because the military commander is always directly and individually accountable for the employment of all methods and means of warfare. The commander’s military accountability pervades the battlefield and attaches to the force structure, weapon systems, and tactics used in war, including AI weapon systems. Military accountability is the foundation of military duty and includes the legal obligation to comply with the law of armed conflict, or international humanitarian law. The commander is accountable to superior military and civilian leaders and is subject to political, institutional, and legal sanctions enforced through military order and discipline, including the Uniform Code of Military Justice. The doctrine of the commander’s direct and individual accountability ensures that senior military leaders are answerable and liable for breaches of law and leadership, including in the oversight, selection, and employment of autonomous weapon systems.
