Researcher: War robots are not illegal

New legal research concludes that weapons systems with autonomous attack capabilities, i.e. systems capable of selecting and engaging a target without human intervention, are not illegal in and of themselves. However, a political decision should be made regarding their use, says the researcher.

They are not illegal in and of themselves. But they are potentially extremely problematic, and a political decision should be made regarding their use if the international Law of Armed Conflict (LOAC) is to be observed. This is the conclusion of a PhD dissertation recently defended by Iben Yde from the Department of Law at Aarhus BSS.

The dissertation focuses on weapons systems with a degree of autonomy that enables them to select and engage targets without human intervention. Examples include missile defence systems, sophisticated types of ammunition, and the combat drones, submersible vehicles and tanks of the future.

Strong opposition to war robots

In 2013, Human Rights Watch (HRW) kicked the debate regarding the legality of autonomous weapons systems into high gear with the campaign “Stop Killer Robots”.

According to HRW, the International Committee for Robot Arms Control (ICRAC) and others, such weapons are illegal: partly under international law, because target selection must be controlled by humans, not robots, and partly because it is immoral and an affront to human dignity that a decision to kill can be "outsourced" to a robot.

However, Iben Yde disagrees with a number of HRW’s and ICRAC’s legal conclusions:

“Although there are a number of valid moral and political arguments against these weapons, there’s currently no legal basis for simply declaring them illegal under international law. This is important to emphasise, since the discussion often ends up in a muddled mishmash between politics, law, morality and ethics. Instead, we should discuss under what circumstances and limitations it is acceptable to use these systems,” explains Iben Yde.

Politicians must commit

According to Iben Yde, it is important to begin a discussion on the political level about the use of these systems and, in particular, about whether new rules are needed to regulate their use.

“The longer we wait, the greater the financial and political interests associated with autonomous weapons systems will be, which will make it increasingly difficult to implement regulation in the area, if that’s what you want. There are many interests at stake, both in the weapons industry and in military systems all over the world, because in many cases autonomous weapons systems allow you to avoid deploying flesh and blood soldiers and, in the long run, achieve significant savings in personnel,” says Iben Yde.

Rapid weapons development

Israel, the US, Russia and many other nations increasingly use weapons with a growing degree of autonomy. However, these weapons are still operated by remote control or pre-programming under human supervision, and as such they are not yet autonomous. According to Iben Yde, though, developments in artificial intelligence and other technologies that enable weapons systems with an increasing degree of autonomy are progressing faster than we can imagine.

“We see the fastest progress in the air and under water. There, the physical obstacles are less significant than on land, where the operational environment is infinitely more complex and the risk of accidents therefore much greater. There is no question that the progress is almost unstoppable. This again underlines the need for clear international rules,” says Iben Yde, and adds:

“I hope my research can contribute to a better understanding of the problems these weapons raise under international law. They’re not illegal in and of themselves, but their use will be illegal in a number of situations, as the current technology isn’t advanced enough to comply with the rules of international law on target selection and engagement. That’s why autonomous weapons systems are currently required to be under human control.”

About the research:

  • Iben Yde’s dissertation is titled: “The Legal Implications of Weapons Systems with Autonomous Attack Capabilities - Towards an Understanding of the Changes in Human Influence on Target Selection and Engagement”
  • She defended her PhD dissertation before an assessment committee consisting of Dr. William Boothby from the Geneva Centre for Security Policy, Professor Frederik Harhoff from the University of Southern Denmark and Professor Jens Vedsted-Hansen from Aarhus BSS - Aarhus University.
  • Iben Yde is currently employed at Defence Command Denmark. In the above article, her comments are made solely on the basis of her research at Aarhus University.

Further information

PhD Iben Yde
Department of Law
Aarhus BSS - Aarhus University

Email: iy@law.au.dk
Tel.: +45 26849420