Killer Robots and the Laws of Man: Who’s to Blame for Mission Malfunction?


The following opinion piece was written by Bonnie Docherty, who is a senior researcher at Human Rights Watch and a lecturer on law at Harvard Law School. She is the author of Mind the Gap: The Lack of Accountability for Killer Robots, a report by Human Rights Watch and Harvard Law School’s International Human Rights Clinic.

Fully autonomous weapons, or "killer robots," which would select and engage targets without human intervention, are under development in several countries. But the robots' use of force would undermine the fundamental legal and moral principle that people should be held responsible for their wrongdoing.


Countries and nongovernmental groups around the world have been working for two years now to figure out how to deal with these weapons before they are in production. In April, representatives from 90 countries met at the United Nations in Geneva for their second round of talks on what to do about “lethal autonomous weapons systems.”


Accountability was a major theme at this year’s talks. It helps deter future violations, provides retribution for victims, and reflects social condemnation of the unacceptable act. But it would be sorely lacking when it comes to future weapons that would select and engage targets on their own.


Fully autonomous weapons would also raise a host of other concerns. These weapons would make their own determinations about who should be killed, a prospect that many people find repugnant. It would be difficult to make these weapons comply with international law’s protections of civilians in armed conflict and other situations such as border control and law enforcement operations. The weapons present a significant risk of an arms race and proliferation to rogue states and non-state armed groups.


A report I wrote in advance of last month’s meeting for Human Rights Watch and Harvard Law School’s International Human Rights Clinic, Mind the Gap: The Lack of Accountability for Killer Robots, helped generate discussion about accountability. We found that in most cases no one would be held legally responsible for the actions of a fully autonomous weapon. The programmer, manufacturer, commander, and operator would all escape liability.


Under international criminal law, a commander might be found guilty of intentionally misusing a robot to kill civilians, but the commander could not be held legally responsible if the weapon acted in an unanticipated way. Such situations would likely occur with a weapon that lacked meaningful human control.


And even if a commander had warning that a weapon was going astray, he or she might be unable to prevent it. Fully autonomous weapons are frequently touted as militarily beneficial because of their faster-than-human processing speeds, but those same speeds would interfere with a commander's ability to stop them midstream.


If you could not hold someone responsible under criminal law, could the victim still sue under civil law? A responsibility gap would likely exist under this body of law as well. At least in the United States, the military and military contractors are generally immune from suit. Even if immunity could be overcome, the people bringing suit would find it difficult to prove liability in a case involving highly advanced technology.


An alternative approach proposed for other forms of autonomous technology, such as autonomous cars, is a no-fault compensation scheme. Under such a scheme, victims would need to prove only that they were harmed, not that the harm was foreseeable or the result of a product’s defect. While financial compensation could benefit victims of fully autonomous weapons, it would not close the accountability gap. The absence of a finding of fault would undercut deterrence, retribution, and social condemnation.


The best solution is a pre-emptive ban on the development, production, and use of fully autonomous weapons. A number of countries have joined Human Rights Watch, the Campaign to Stop Killer Robots, and others in calling for such a prohibition. A ban would be easier to enforce than regulation and would increase stigmatization, a powerful tool in international law.


Merely regulating the weapons also leaves room for misuse. Once the weapons entered their arsenals, countries could be tempted to use them inappropriately, as has been the case with other widely condemned weapons, such as cluster munitions.


A prohibition should be adopted now because the more nations invest in the technology, the less willing they will be to give it up. By ensuring that all weapons retain meaningful human control, countries could help protect the principle of personal accountability on the battlefield and the police beat.

