7 Feb 2020

Killer robot ethical and legal debate under the microscope

From Afternoons, 1:27 pm on 7 February 2020

First there was gunpowder, then nuclear weapons, now another revolution in war fighting is underway with the rise of algorithmic warfare, using robots and precision targeting technology.

Associate Professor Amy Fletcher Photo: supplied

University of Canterbury researchers have been awarded a Marsden Fund grant to examine the current debate on the use of ‘killer robots’ in warfare.

Researcher Associate Professor Amy Fletcher says they hope their analysis and understanding of the issues will help shape future regulations governing the use of autonomous weapons in conflict.

While killer robots may sound like science fiction, Fletcher says the artificial intelligence doesn’t have to be at the level of The Terminator to be incredibly precise, lethal and dangerous.

“The key question becomes, even if it’s not a killer robot as per Hollywood, what about those weapons that can potentially take a human out of the decision loop and make an autonomous decision to take that kill shot?”

We’re not set up ethically to really deal with the ramifications of this, she says.

“We’re already seeing divisions among some of the key nation states that are trying to develop this technology.”

About 28 countries are currently and openly seeking a pre-emptive ban, she says, while other nations see the benefits of the AI.

Although the project is in its early days, one important argument being made is that warfare between robots, removing humans from the battlefield altogether, would decrease human casualties, she says.

“At the same time though, the debate does stay quite complex, because we know that most theatres of war today are not between two peer nation states; they’re happening between incredibly technologically sophisticated nation states and much poorer, developing countries.”

Robots don’t get tired, make snap decisions or get nervous, she says, although fully autonomous combat robots are still a while off.

What is in development currently, though, are autonomous precision weapons, such as Russia’s ‘suicide drone’.

“It’s meant to choose a target and then blow itself up once it gets within a certain proximity to the target, destroying itself and everything in the area. The explicit goal is to take the human out of that loop.”

The psychological impact of drones can be devastating to populations that have to live with them, she says, regardless of whether the shot is taken.

These are important considerations to discuss as well, she says.