SIGHT

Red Cross sounds alarm over use of ‘killer robots’ in future wars

Thomson Reuters Foundation

Countries must agree strict rules on “killer robots” – autonomous weapons that can kill without human involvement – a top Red Cross official has said, amid growing ethical concerns over their use in future wars.

Semi-autonomous weapons systems, from drones to tanks, have for decades been used to eliminate targets in modern-day warfare – but all of them have a human in control behind them.

With rapid advances in artificial intelligence, there are fears among humanitarians that it will be used to develop machines that can independently decide whom to kill.

Yves Daccord, director-general of the International Committee of the Red Cross, said this would be a critical issue in the coming years as it raised ethical questions about delegating lethal decisions to machines and about accountability.

“We will have weapons which fly without being remotely managed by a human and have enough intelligence to locate a target and decide whether it is the right person to take out,” Daccord told the Thomson Reuters Foundation in an interview.

“There will be no human making that decision, it will be the machine deciding – the world will essentially be delegating responsibility to an algorithm to decide who is the enemy and who is not, and who gets to live and who gets to die.”

In 1949, the ICRC initiated the international adoption of the four Geneva Conventions that lie at the core of international humanitarian law.

Since then, it has urged governments to adapt international humanitarian law to changing circumstances, in particular to modern developments in warfare, so as to provide more effective protection and assistance for conflict victims.

A global survey published by Human Rights Watch and the Campaign to Stop Killer Robots, a global coalition of NGOs, on Tuesday found six out of ten people polled across 26 countries oppose the development of fully autonomous lethal weapons.

The study, conducted by Ipsos, surveyed 18,795 people in 26 countries including Brazil, India, the United States, Britain, China, South Africa, Japan and Israel.

Daccord said autonomous weapons crossed a moral threshold as machines did not have the human characteristics such as compassion necessary to make complex ethical decisions.

They lacked the human judgment needed to evaluate whether an attack was a proportional response, to distinguish civilians from combatants, and to abide by core principles of international humanitarian law, he added.

The issue of “killer robots” has divided humanitarians.

United Nations Secretary-General Antonio Guterres has called for a complete ban, while other organisations, such as the ICRC, are advocating strict regulation.

“We should not go for banning, but I am of the opinion that we have to keep a level of human control over such weapons. This means that, at any time of the operation, a human can intervene,” said Daccord.

“There are no guidelines regarding their use and they have not even been defined yet, so we have to create a common grammar between states and develop guidelines, or treaty law.”

The rules would address issues such as the definition of autonomous weapons, the level of human supervision over them – including the ability to intervene and deactivate them – and the operational conditions for their use, the ICRC says.

Supporters of autonomous weapons argue they will make war more humane: more precise in identifying and eliminating targets, immune to human emotions such as fear or vengeance, and able to minimise civilian deaths.

But Daccord said such machines could malfunction, and this raised questions over who would be held responsible.

“You can hold people accountable under international humanitarian law with remotely managed weapons such as drones. With autonomous weapons, we are moving into new territory,” he said.

“There is a process under way, but we have to get countries together to agree on a common text which is not easy. It’s better they start to negotiate now and find an agreement than wait for a major disaster.”

 
